Wikipedia:Bot requests/Archive 82


Archive 75Archive 80Archive 81Archive 82Archive 83Archive 84Archive 85

WantedPages is pretty useless as it is since it considers links from and to talk pages. Does the requested action above help at all? JsfasdF252 (talk) 03:06, 5 January 2021 (UTC)

A better solution has been discussed and requested in Phabricator. Until then, WP:Most-wanted articles may be a more helpful alternative. Certes (talk) 10:58, 5 January 2021 (UTC)

Moving old WP:FFD pages

Some older WP:FFD pages are titled WP:Files for deletion instead of WP:Files for discussion. Should a bot be created to move them to the newer title just like how WP:Votes for deletion pages were moved to WP:Articles for deletion for the sake of consistency? P,TO 19104 (talk) (contribs) 15:54, 23 January 2021 (UTC)

Try seeing if WT:FFD is interested first. --Izno (talk) 17:55, 23 January 2021 (UTC)
@Izno: I just posted there - see Wikipedia talk:Files for discussion#Discussion at Wikipedia:Bot requests § Moving old WP:FFD pages (contrary to the title it also invites response there). P,TO 19104 (talk) (contribs) 23:23, 23 January 2021 (UTC)
Happy to code this if there’s consensus. ProcrastinatingReader (talk) 18:51, 24 January 2021 (UTC)
I don't expect a lot of input at WT:FFD, so I have started on RFC on this. P,TO 19104 (talk) (contribs) 13:13, 25 January 2021 (UTC)

StarWars.com

Anything with http://www.starwars.com should be changed to https. JediMasterMacaroni (Talk) 17:03, 25 February 2021 (UTC)

Please post to WP:URLREQ. – Jonesey95 (talk) 18:18, 25 February 2021 (UTC)

FANDOM

FANDOM to Fandom (website), please. JediMasterMacaroni (Talk) 00:57, 24 February 2021 (UTC)

There is no reason to change redirects. Primefac (talk) 01:06, 24 February 2021 (UTC)

Replace Template:IPC profile with Template:IPC athlete

There are some 800+ transclusions of Template:IPC profile. They go to an archive page, because the original link doesn't work, but for the first five I checked at random, the archive page doesn't work either: Scot Hollonbeck, Stephen Eaton, Jonas Jacobsson, Sirly Tiik, Konstantin Lisenkov.

It seems possible to replace the template with Template:IPC athlete: {{IPC profile|surname=Tretheway|givenname=Sean}} becomes {{IPC athlete|sean-tretheway}}. It is safer to take the parameter from the article title than from the IPC profile template though: at Jacob Ben-Arie, {{IPC profile|surname=Ben-Arie|givenname=<!--leave blank in this case, given name not listed-->}} should become {{IPC athlete|jacob-ben-arie}}.
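The title-to-slug conversion described above could be sketched roughly as follows. This is a hypothetical helper, assuming the IPC site uses lowercase, hyphen-joined, accent-stripped names (as in the "jacob-ben-arie" example); a real bot would also need to handle disambiguators in article titles.

```python
import re
import unicodedata

def ipc_slug(title: str) -> str:
    """Derive a candidate IPC athlete URL slug from an article title,
    e.g. "Jacob Ben-Arie" -> "jacob-ben-arie". Hypothetical helper."""
    # Strip diacritics so e.g. "é" becomes "e"
    ascii_title = (
        unicodedata.normalize("NFKD", title)
        .encode("ascii", "ignore")
        .decode("ascii")
    )
    # Collapse any run of non-alphanumeric characters into a single hyphen
    return re.sub(r"[^a-z0-9]+", "-", ascii_title.lower()).strip("-")
```

For example, `ipc_slug("Sean Tretheway")` gives `"sean-tretheway"`, matching the {{IPC athlete|sean-tretheway}} form above.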

If the replacement is too complicated, then simply removing the IPC profile one is also an option, as it makes no sense to keep templates around which produce no useful results. Fram (talk) 11:34, 5 January 2021 (UTC)

@Fram: if I’m understanding you right, is the template totally redundant and should all transclusions be replaced with IPC athlete? If so, you can just TfD the template, then an existing bot with a general TfD authorisation can easily do this task. It’s also probably faster (it’ll probably take at least 7 days for community input + BRFA for the task alone otherwise). ProcrastinatingReader (talk) 00:33, 6 January 2021 (UTC)
Thanks, I'll bring it up at TfD then, didn't know that their "power" went that far (but it is a good thing). Fram (talk) 08:23, 6 January 2021 (UTC)
Primefac given TfD is closed, can your bot action this? ProcrastinatingReader (talk) 16:05, 2 February 2021 (UTC)
If it's at WP:TFDH, it will be actioned. Primefac (talk) 17:30, 2 February 2021 (UTC)
I've done 300 of these and can safely say that it's too complex for a bot, but quite quick with AWB. --Trialpears (talk) 17:28, 1 April 2021 (UTC)
Somehow I managed to do the other 500 as well. I guess it's done. --Trialpears (talk) 22:33, 1 April 2021 (UTC)

Convert to cite Twitter

I would love for a bot or script which could allow users to turn a page's citations which include a URL to Twitter into instances of {{cite tweet}}. MJLTalk 20:17, 3 February 2021 (UTC)

  • A script would likely be better, unless it can safely be run on all Twitter URLs in ref tags (in which case a bot may be acceptable). ProcrastinatingReader (talk) 21:10, 3 February 2021 (UTC)
  • Why do we cite Twitter again? :) --Izno (talk) 00:42, 4 February 2021 (UTC)
    • This is not an appropriate task for a bot, given CITEVAR. – Jonesey95 (talk) 16:22, 4 February 2021 (UTC)
      Wouldn't this just be an extension of the work already done by Citation bot? — The Earwig ⟨talk⟩ 16:54, 4 February 2021 (UTC)

This bot is needed for fixing grammatical errors. I noticed there were a number of grammatical errors in pages which were not attended to by any users or administrators. — Preceding unsigned comment added by Kohcohf (talk • contribs)

You will have to be much more specific in what you are requesting. See also WP:CONTEXTBOT. —  HELLKNOWZ   TALK 15:22, 18 February 2021 (UTC)

Create and maintain a category of pages in draftspace that are not redirects

Trying to browse through and improve the drafts at Special:AllPages/Draft: is hard because there are so many redirects. Is it possible to create and maintain a category of drafts that are not redirects for easier navigation?

I guess you could use Special:PrefixIndex/Draft: which will give you all pages in the draftspace, and you can filter and remove redirects. Primefac (talk) 01:42, 19 February 2021 (UTC)

Edmonds Community College

Edmonds Community College is now known as Edmonds College. I request a bot that changes all articles saying Edmonds Community College to fix the wikilink to point to Edmonds College. This is a college in Washington state, USA. 2601:601:9A7F:1D50:1D79:66DB:8571:CFA7 (talk) 09:07, 2 April 2021 (UTC)

Not a good bot task. For example, here the text should remain Edmonds Community College, as that was the name of the college at that time. Manually changing those cases where the text should be updated is the way to go. Fram (talk) 09:12, 2 April 2021 (UTC)

OK 2601:601:9A7F:1D50:1D79:66DB:8571:CFA7 (talk) 09:15, 2 April 2021 (UTC)

Internet Archive

Links die on the internet. I request a bot that checks whether a citation link is mirrored on the Internet Archive and, if so, rewrites the link to point to its archived copy, which won't suffer from link rot. This preserves references on Wikipedia which may vanish over time. Big job! I know! Smart coder needed. 2601:601:9A7F:1D50:1D79:66DB:8571:CFA7 (talk) 09:13, 2 April 2021 (UTC) --- This could be used to really fix the dead links backlog by checking whether each dead link was mirrored and then rewriting it to point to the archive. --- A second idea is to check whether the link is mirrored and, if so, add a backup link to the archived copy in case the first reference link dies from being removed from the Internet.

This bot already exists * Pppery * it has begun... 01:04, 4 April 2021 (UTC)
Nice! Can somebody tell the bot to go deal with dead links? OK, I see cyberpower runs it; I'll go ask them. 04:34, 4 April 2021 (UTC)

Request for bot called "EnergyBot"

I want to request this bot called "EnergyBot". EditJuice (talk) 16:16, 7 April 2021 (UTC)

EditJuice, and what exactly would this bot do? GeneralNotability (talk) 16:16, 7 April 2021 (UTC)

The bot would make talk page archives. EditJuice (talk) 16:19, 7 April 2021 (UTC)

@EditJuice: Do you have particular talk page archiving needs that can't be handled by one of the existing archive bots (see also instructions for setting them up)? Vahurzpu (talk) 16:58, 7 April 2021 (UTC)

No, I don't even have an archive. I request the bot for later, when I will have an archive already. EditJuice (talk) 17:15, 7 April 2021 (UTC)

Then just use one of the existing ones. Headbomb {t · c · p · b} 19:14, 7 April 2021 (UTC)

The site sentragoal.gr has been hijacked by a gambling site and we should be looking to deactivate active reference links to that source. If someone is able to manage that easily, that would be fantastic. — billinghurst sDrewth 23:55, 13 February 2021 (UTC)

By which you mean that links in CS1/2 templates should get a parameter |url-status=usurped and other links should... have what happen to them? --Izno (talk) 00:44, 14 February 2021 (UTC)
There are only 141 instances of the text "sentragoal.gr" in articles. If you look at Olympiacos F.C. for example, you see that the reference is through a {{webarchive}} link. I suggest that the best outcome would be to supply archived urls for as many as possible. Isn't there a bot that tries to recover dead links by that technique? Maybe it could do this job with minor modifications? --RexxS (talk) 01:23, 14 February 2021 (UTC)

Notice of this request has been posted at WT:LANG, and has received only positive comments (thanks or text).

I formatted an example by hand at Dâw language. There are a bit over 3000 URLs to link to. They provide demographic data and reliable sources for the languages, and are an alternative to Ethnologue, which is now behind a very expensive paywall. (And in some cases ELP is a check on Ethnologue, as the two sites often rely on different primary sources and often give very different numbers.)

Last time I did something like this it was handled by PotatoBot, but Anypodetos tells me that's no longer working.

Goal

Add links to the Endangered Languages Project (ELP) from our language articles through {{Infobox language}}, parallel to the existing links to other online linguistic resources (ISO, Glottolog, AIATSIS, etc.)

Data

The list of ELP language names and associated ISO codes and URLs is here. I would be happy if the entries in the table with single ISO codes were handled by bot. I can do the rest by hand, but see below.

There are three columns in the table. Two contain values for the bot to add to the infobox. The third is for navigation, an address for the bot to find the correct WP article to edit.

Action

The bot should add params "ELP" and "ELPname" to the infobox, using the values in the columns 'ELP URL' and 'ELP name' in the data table.

The value in the column 'ISO code' is to verify that the bot is editing the correct WP article. The bot should follow the WP redirect for that ISO code and verify that the ISO code does indeed occur in the infobox on the target page.

Example

For example, say one of the entries in the data table has the ISO code [abc]. The WP redirect for that code is ISO 639:abc. That should take the bot to a language article, and the bot should verify that the infobox on that article does indeed have a param ISO3 = abc or lc[n] = abc (where [n] is a digit).

If there isn't a match (and it's been years since we've run a maintenance bot to verify them all), then that ELP entry should be tagged as having a bad WP redirect for the ISO.
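The verification step above could look something like this. This is only a sketch, assuming the bot already has the target page's wikitext in hand; a production bot would parse the infobox properly (e.g. with mwparserfromhell) rather than using a regex.

```python
import re

def infobox_has_iso(wikitext: str, iso_code: str) -> bool:
    """Check whether a language infobox's wikitext carries the given
    ISO 639-3 code, either as |iso3= or as one of the |lc1=, |lc2=, ...
    parameters (the multi-code case described under Complications)."""
    pattern = re.compile(
        r"\|\s*(?:iso3|lc\d+)\s*=\s*" + re.escape(iso_code) + r"\s*(?=[|}\n])",
        re.IGNORECASE,
    )
    return bool(pattern.search(wikitext))
```

If this returns False for the article the ISO 639:abc redirect points at, the bot would tag that ELP entry as having a bad redirect rather than edit the page.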

Complications

There is sometimes more than one ISO code per language infobox, because we don't have separate articles for every ISO code. (This is where the params lc[n] come in.) If the bot finds that there's already an ELP link in the box from a previous pass, then it should add the new codes as ELP[n] and ELPname[n], and keep a list so we can later code the template to support the article with the largest number [n] of links.

There is occasionally more than one infobox in a WP language article. It would probably be easiest if I did such cases by hand, since there are probably very few of them (if any), unless the bot can determine which infobox on the page contains the desired ISO code.

The bot should test that the external URL lands on an actual page. For instance, a language in the data table is listed as having URL 8117, but following 8117 gets the error message "Page not found :(". Such bad URLs should be tagged both for this project and for submission to the ELP.

ELP entries with multiple ISO codes (optional)

If the programmer of the bot wishes to, it would be nice if they could do a run for the 40+ ELP entries that each have 2 ISO codes. (Or have three, if the coding is easy enough, but there are only 16 of those. Anything more than that I should probably do by hand.) If the redirects for those two ISO codes both link to the same Wikipedia article, then the ELP params should be added as above. If they link to different articles, they should be tagged and I'll do them by hand.

Please ping me if you respond. — kwami (talk) 11:14, 16 January 2021 (UTC)

Consider adding this info to Wikidata instead and then pulling it from there. There are multiple systems there that can probably make this a fairly quick job. --Izno (talk) 08:31, 27 January 2021 (UTC)
@Kwamikagami: To elaborate a bit more on Izno's comment: most of the unambiguous ELP IDs are already in Wikidata. I'm currently in the process of importing the corresponding ELP names. The IDs are on the statements as endangeredlanguages.com ID (P2192) and the names as subject named as (P1810) qualifiers. The next step would be to modify {{Infobox language}} to use these, but I'm not familiar enough with Module:WikidataIB to do this myself. Vahurzpu (talk) 19:44, 27 January 2021 (UTC)
We can summon RexxS and he will appear as if by magic to fix all things. --Izno (talk) 20:00, 27 January 2021 (UTC)
I'm not familiar with Wikidata or how to access it through an info box. Probably not a bad thing to learn. It would be nice to have a central repository to make updates easier. BTW, I got the list from ELP. Some of the URLs haven't been created yet. That includes all of the higher numbers and a few scattered lower numbers. I think I've weeded those out, though. — kwami (talk) 21:57, 27 January 2021 (UTC)
@Kwamikagami and Vahurzpu: I'm always happy to help anyone learn, and I can give you an example of fetching the data you want from Wikidata, if you'd like.
You can get the value of endangeredlanguages.com ID (P2192) from Dâw (Q3042278) like this:
{{#invoke:WikidataIB |getValue |ps=1 |P2192 |qid=Q3042278}} → 2547
and the qualifier subject named as (P1810) like this:
{{#invoke:WikidataIB |getValue |ps=1 |P2192 |qid=Q3042278 |qual=P1810 |qo=y}} → Dâw
You would normally place those calls in the infobox definition, but that will become complicated if there are multiple values for the language's ELP identifier. I can't find one right now. Are there any? If so, I'll write a custom function call for you tomorrow, when I've found an article to test it on.
Otherwise, I have modified Template:Infobox language/sandbox to show you how it would work in the Dâw language infobox. See if that does what you want and let me know. --RexxS (talk) 01:09, 28 January 2021 (UTC)
That looks good, thanks. But what decides whether an ELP code appears, and which one appears? (Not counting the manual override.) — kwami (talk) 07:06, 28 January 2021 (UTC)
@Kwamikagami: Whether an ELP code appears or not depends on whether or not it has the requisite data on its associated Wikidata item (these are linked on the sidebar; for an example, see Dâw (Q3042278)). About 2900 pages currently have ELP IDs on their Wikidata items, and assuming none of that has changed in the last 12 hours, all those have names.
In the case where there are multiple ELP IDs for a single page: it isn't handled cleanly (to see exactly what it looks like, go to Bonan language, switch {{Infobox language}} to {{Infobox language/sandbox}} and preview). However, there are only 7 pages where this would apply currently, and those probably need a manual override anyway. Vahurzpu (talk) 07:50, 28 January 2021 (UTC)

Sorry, I didn't follow any of that. I don't see any of the data at Wikidata. E.g., I can't tell which are the 7 pages with multiple IDs, or how it was determined which page gets which ELP ID. — kwami (talk) 08:13, 28 January 2021 (UTC)

@Vahurzpu: as promised, I've made a custom module Module:Endangered Languages Project to deal with fetching the ELP data. It handles multiple values and allows a local value to override the Wikidata value. If you now look at Bonan language, you'll see the format I've used for multiple ELP values. Let me know if you want something different.
@Kwamikagami: You don't need to know how many ELP values are available in the Wikidata entry as the code now takes care of that. If you update Template:Infobox language from its sandbox, every article which already has ELP and ELPname parameters will remain unchanged, and every article that doesn't have those parameters set will try to fetch them from the corresponding Wikidata entry and use those. Please let me know if you need more explanation. --RexxS (talk) 13:30, 28 January 2021 (UTC)

Thanks, @RexxS:! That looks great!

Where would we go to update the ELP values?

Could you generate a list of ELP IDs with single ISO codes that are not being triggered, so I could fix them manually? I've noticed several, but would rather not search all 3000 to check.

Could you add a name to the refs so we could call them with <ref name=ELP/>, <ref name=ELP2/>? And could you add a link to Category:Language articles with manual ELP links for articles that have a value in ELP? (I've done it for ELP2 in the template.)

A slight hiccup: when ELP is entered manually without ELPname, nothing displays. Something should show, if only to alert editors that the infobox needs to be fixed.

BTW, see Yauyos–Chincha Quechua, where there is a second, partial match. (The only ELP article said to be a subset match to an ISO code.) I used ELP2 to add that to the automated link.

Gelao language has up to ELP4. — kwami (talk) 22:08, 28 January 2021 (UTC)

@Kwamikagami: I think we're talking at cross purposes. Izno and Vahurzpu suggested using Wikidata to store the ELP code and ELP name, and I created a way to fetch the information from Wikidata. You seem to want to add the information manually to each article, or have a bot do that for you. Either way will work, but obviously not both at the same time. Personally, I'd recommend storing the ELP identifiers on Wikidata because that makes them available to all 300+ language Wikipedias, but you may prefer not to. If there is a list of these ELP identifiers, then a bot could add them to Wikidata, although you'd need someone on Wikidata to do the request for you.
I've just added the four ELP values for Gelao language to Gelao (Q56401) on Wikidata and removed the manual parameters from the article. As you can see, the information is now fetched from Wikidata. I've added code to generate a name for each reference, ELP1, ELP2, etc.
Previously, when ELPname was added without ELP, nothing was displayed. I coded the module so that there is nothing displayed in either case, which is preferable for readers. But I understand that you need something for editors to see where problems may occur, so I've amended it to show the parameter (unlinked) and added a tracking category Category:Language articles with missing ELP parameters to catch cases where one ELP parameter is missing and can't be supplied from Wikidata. --RexxS (talk) 15:03, 29 January 2021 (UTC)

@RexxS: Actually, I do prefer Wikidata, but I didn't know how & where to go about modifying it.

I think there will still be some need to augment it manually, though. In other language WP's, they may decide to follow ISO divisions where we do not, or have other differences in scope that would not be appropriate in WD. So, unless there's a work-around (I'm not familiar with WD), we should probably have the universal elements in WD for every WP to access, and then manual overrides when some particular WP wishes to diverge from that, for whichever reason. (E.g. deciding that ISO or ELP is inaccurate, based on the sources used for an article.) Wouldn't putting everything in Wikidata cause conflicts between different-language WPs?

Also, how can we generate a list of the ELP IDs that are called in WP-en, so I can fix the ones that aren't? — kwami (talk) 01:17, 30 January 2021 (UTC)

Hi Kwamikagami: I'm no Wikidata expert, but I hope this query may help (use the blue run button). It should show you a table of Wikidata item IDs, ELP IDs, ELP names, and enwiki article titles where there is a connection. (There are multiple rows for cases where an item has multiple ELP IDs.) — The Earwig talk 02:58, 30 January 2021 (UTC)
Thanks, Earwig!
And I must say, that's a user name I'm not going to forget soon! :-) — kwami (talk) 03:11, 30 January 2021 (UTC)

That reduces it to about 500 articles I need to check by hand or manually add to WikiData. — kwami (talk) 09:29, 2 February 2021 (UTC)

@RexxS, Vahurzpu, and The Earwig: In the bottom section at Wikipedia talk:WikiProject Languages/List of ELP language names (#Names in the 'Languages with single ISO codes' ...) are the 500+ ELP names that should be linked from WP articles but aren't. Sometimes that's because the WP article covers more than one ELP language, but other times I don't see why there's no link. Maybe just a mismatch in names?

Would it be possible to add those ELP names & links to the WP articles through Wikidata? (To the WP articles that those blue-linked ELP names redirect to.) I've done a few manually, and can revert that once they're in Wikidata. — kwami (talk) 04:20, 3 February 2021 (UTC)

kwami: I looked through a few examples on that page and in many cases it's not obvious what to do. For Nenets, your list gives an ELP ID of 5847, which points to an invalid page on ELP's website, so I don't think this should be added. For Tujia, the article covers both Northern and Southern dialects, but there are separate Wikidata items for Northern Tujia (Q12953229) and Southern Tujia (Q12633994), and the ELP IDs are (rightfully) located on those items instead of the general Tujia item. For this general problem, we may consider borrowing an approach used by {{Taxonbar}}. Rather than manually add the ELP IDs for the dialects to the main language infobox, we add the Wikidata items of the constituent dialects and the template automatically pulls the ELP IDs from those items instead of the page's item. That is, rather than adding |ELP=4225|ELP2=1744 to the template, we add |from=Q12953229|from2=Q12633994, and the template pulls the ELP IDs from there. This has an advantage of making it easy to maintain other dialect identifiers if we choose to move more identifiers to Wikidata. I am not sold on this approach, but wanted to propose it. — The Earwig ⟨talk⟩ 07:17, 14 February 2021 (UTC)
Thanks, Earwig. Is there a way to automate that? — kwami (talk) 07:33, 14 February 2021 (UTC)
It's possible to automate either that or adding the ELP IDs directly. I looked around and I don't see any infoboxes doing what I described, so it might be too esoteric of a proposal. I'd appreciate input from someone more familiar with infobox design. The situation we have is rather hairy, with many ELP IDs invalid or attached to Wikidata items that are different from the articles that discuss them. — The Earwig ⟨talk⟩ 00:30, 15 February 2021 (UTC)

A bot for adding missing date format tags

There appear to be many thousands of articles that are missing {{use mdy dates}} or {{use dmy dates}}, but which are linked to information that is sufficient to determine which tag should be used. For instance, I think we can safely assume that an untagged page for a high school in a subcategory of Category:High schools in the United States (or with country (P17) = United States (Q30) on Wikidata) ought to be using MDY, or that an untagged British biography page not in any categories for expatriates or dual nationality ought to be using DMY. The 3500 pages that use {{American English}} but have no tag seem like an even easier call.

I'd like to see a bot that goes through old pages and adds the appropriate tags where it can make a firm determination. It would then operate periodically to add DMY or MDY tags to new pages as they are created (but would not override any pages tagged manually). This would help reduce the incidence of the ugly 2021-02-15 dates, and save some amount of editor work. It would be very low-risk, as even if there's some unforeseen circumstance that causes the bot to occasionally mess up, there's very little damage done (e.g. Americans can still understand DMY fine, likewise for Brits with MDY, and most would probably prefer either to YYYY-MM-DD) and correction would be easy.

Does anyone want to take this on? {{u|Sdkb}}talk 23:27, 15 February 2021 (UTC)

@Sdkb: How would a bot be able to determine when the current format (even if it's "the ugly 2021-02-15 dates") should be retained per MOS:DATERET? GoingBatty (talk) 04:11, 16 February 2021 (UTC)
GoingBatty, MOS:DATERET carves out an exception for switches based on strong national ties to the topic, which would be the case for the categories here. {{u|Sdkb}}talk 05:43, 16 February 2021 (UTC)
