Talk:List of manual image annotation tools


List selection criteria

In response to a still-open discussion at the Wikipedia:External links/Noticeboard about this page, an editor has just updated the cleanup tag at the top to say that only independent reliable sources are acceptable for determining whether an entry belongs in this list. Following that advice would eliminate the use of non-independent reliable sources; e.g., you couldn't use dataturks.com to verify that Dataturks is image labeling software, even though (a) the front page of that website says that it is, and (b) nobody would ever seriously contest the verifiability of that claim.

@Ryouchinsa, Biogeographist, Adutta.np, Manuaero, and Mohangupta13: you are the five most frequent editors of this list. What do you think the Wikipedia:List selection criteria for this page should be? There is no set rule about what the criteria should be for an article like this, so you get to make it up. (I'd recommend against the "must have an existing article" standard, however. That's meant for lists like "People from New York", which could otherwise have millions of entries.) WhatamIdoing (talk) 23:46, 28 August 2018 (UTC)[reply]

@Beetstra:, who offered feedback on the WP:ELN thread about this list and may be interested here as well. One more important point for other editors: editors with a possible "conflict of interest" are strongly discouraged from editing articles about their own products and companies. But of course such editors are welcome to post suggestions for changes here on the talkpage (see also WP:COI for more information). GermanJoe (talk) 00:36, 29 August 2018 (UTC)[reply]
@WhatamIdoing: my criterion is not that it needs its own article. It needs a reference, independent and reliable, to show it belongs in this list. That works for people in New York, it works for versions of Pac-Man and for bikes with alternative wheel arrangements. Show it belongs in the list. --Dirk Beetstra T C 02:16, 29 August 2018 (UTC)[reply]
I'm not sure whether to say, "Of course we agree" or "Why do you sound upset?" I don't think that anyone said anything about what your views might be. I've personally recommended against one of the criteria that LSC mentions (in the LISTPEOPLE section; look for the paragraph that begins "In other cases, editors choose even more stringent requirements, such as already having an article written") and expressed no opinion on any of the other typical options, including whether a typical option might be suitable. (I think that the scope of the article is something that the regular editors of the article should decide.) AFAICT, nobody has claimed that you supported or opposed any option. WhatamIdoing (talk) 03:01, 29 August 2018 (UTC)[reply]
(sorry, I thought I was on ELN, and that you replied to my earlier remark there. My apologies --Dirk Beetstra T C 03:34, 29 August 2018 (UTC))[reply]
I added the cleanup template to this list. I watch a number of other software lists, and all of them attract WP:LINKSPAM. Wikipedia:List selection criteria, the page linked above by WhatamIdoing, says with regard to lists of companies and organizations: "If the company or organization does not have an existing article in Wikipedia, a citation to an independent, reliable source should be provided to establish its membership in the list's group." I imagine this guideline exists, in part, to prevent spam. Since most software is created by companies and organizations, and since lists of software tend to attract spam, this seems to be an important guideline for lists of software as well. In any case, it is my understanding that it's a Wikipedia policy that a Wikipedia page should not be a WP:LINKFARM, and per WP:ELLIST stand-alone lists "should not be composed of external links. These lists are primarily intended to provide direct information and internal navigation, not to be a directory of sites on the web." I don't have any additional suggestions for inclusion criteria beyond the Wikipedia policies and guidelines that I just cited. Biogeographist (talk) 02:35, 29 August 2018 (UTC)[reply]
This isn't really a list of "companies or organizations", though. It seems to me that the general criteria are more relevant, and those recommend only "reliable sources" and "inline citations" (not requiring that those reliable sources be exclusively WP:INDY reliable sources). WhatamIdoing (talk) 03:07, 29 August 2018 (UTC)[reply]
Some of the software in the list are also companies (the company and the software have the same name). I also explained above why I think the guideline about lists of companies and organizations is relevant to lists of software as well. Even the guideline on reliable sources says that "journalistic and academic sources are preferable" to commercial websites: I imagine it is for the same reason that independent reliable sources are required for lists of companies and organizations. Requiring independent sources cuts out the WP:ADVERTs, including people who are just trying to drive traffic to their websites. Biogeographist (talk) 03:22, 29 August 2018 (UTC)[reply]
(edit conflict) Still, we need some criterion beyond self-published existence. When buying things online you compare using review sites; if a product is in one of those reviews it can be in the list, since that shows that people actually care about the product. --Dirk Beetstra T C 03:34, 29 August 2018 (UTC)[reply]
Yes, as WP:SPS also says: "if the information in question is suitable for inclusion, someone else will probably have published it in independent reliable sources." Biogeographist (talk) 03:39, 29 August 2018 (UTC)[reply]
For this type of article it is also often true that you can find some (notable) company using the product of a non-notable company ('the MediaWiki software is written in PHP' type of statement). If no one uses the product ... --Dirk Beetstra T C 03:52, 29 August 2018 (UTC)[reply]
Noting that if an SPS were to say that about their company, that may also be sufficient. --Dirk Beetstra T C 03:54, 29 August 2018 (UTC)[reply]
Meeting the suggested reasonable requirements should be trivially easy for any noteworthy product that gained at least some public coverage. The previous approach, with self-published sources and unrestricted promotional COI editing, clearly did not work for such a list about a commercial topic. I agree with the suggested inclusion criteria from the added cleanup tag. These criteria are also in line with WP:CSC. GermanJoe (talk) 09:23, 29 August 2018 (UTC)[reply]

I'll try to find 10 minutes to neutralize this list so we can see what needs additional sources. --Dirk Beetstra T C 03:56, 29 August 2018 (UTC)[reply]

I have done some work on it:
1) moved the external link column to the right
2) added a wikilinked column as first column
3) added a 'References' column at the right. This should show that the item 'belongs' in the list, per above.
4) Removed all 'see here' and similar external links out of the prose
5) un-external-linked the 'custom licenses', which were just another copy of the same external link in most cases.
Most of the items are now redlinked (except one ..). Probably some could be linked to appropriate places. That still means that most of them would need some form of 'it belongs here' reference. I would suggest that we give it some time, and then consider removing anything that is not reasonably sourced as belonging in the list. --Dirk Beetstra T C 10:22, 29 August 2018 (UTC)[reply]
6) alpha-sorted the whole thing, and then cleaned up some of the worst PR speak and irrelevant company details. GermanJoe (talk) 10:41, 29 August 2018 (UTC)[reply]

Note that I am going to remove ALL items that do not have independent sourcing by the beginning of next month (at which point one month will have been given to provide references). That excludes blue-linked items, as they have independent references through their own article. I will move the removed items to a table here for future reference so that sourcing efforts can continue. --Dirk Beetstra T C 05:28, 17 September 2018 (UTC)[reply]

Suggested edit - add my company and product (Clay Sciences)

@Ryouchinsa, Biogeographist, Adutta.np, Manuaero, and Mohangupta13: I think my company & product (Clay Sciences) should be listed here. I previously added revision 859573609, but have since learned about the independent-sourcing requirements and conflicts of interest (thanks @Beetstra:!). Here are three independent sources about us. Would any of this page's editors consider editing and/or adding my revision? Happy to explain more about the product if needed for editing purposes.

thanks Arielbaz (talk) 20:01, 19 September 2018 (UTC)[reply]

Mask Editor - new non-notable application removed

I have removed the application once again, as the entry - likely added with a conflict of interest - did not include any independent reliable sources. The first source is a trivial GitHub listing of this new one-developer project, which was committed just 3 weeks ago. The second is just an archived, unreviewed paper, submitted 2.5 weeks ago by the developer and a few co-authors. Neither of these sources is independent enough to merit list inclusion - Wikipedia is not a venue for self-promotion or for advertising new personal projects. GermanJoe (talk) 17:36, 3 October 2018 (UTC)[reply]

Possible conflict of interest for adding a (to be) published paper reference

Hi, I'm the main author of the "Annotation App". I'd like to add a reference to a peer-reviewed conference article explaining how this application works. The article already has a DOI (not accessible yet, though) and should be published soon, with the conference happening in two weeks. Is this the kind of reference I (or someone) can add in the "References" column for this entry?

The reference:

Matthieu Pizenberg, Axel Carlier, Emmanuel Faure, and Vincent Charvillat. 2018. Web-Based Configurable Image Annotations. In 2018 ACM Multimedia Conference (MM ’18), October 22–26, 2018, Seoul, Republic of Korea. ACM, New York, NY, USA, 4 pages. https://doi.org/10.1145/3240508.3243656 Mattpiz (talk) 05:44, 4 October 2018 (UTC)[reply]

@Mattpiz: I am going to clean out all items that do not have an independent (secondary) reliable source attached to them (copying everything here). You can add the reference (it certainly helps with attribution, but since it is not independent it does not meet Wikipedia's sourcing requirements), but likely the whole item is going to be removed soon (unless you have independent sourcing as well). --Dirk Beetstra T C 06:20, 4 October 2018 (UTC)[reply]
@Beetstra: Well, peer-reviewed scientific journals and conferences are reliable sources, often more reliable than articles in web journals. So it would be a shame not to take those (published by independent organizations, here ACM) into account as independent, reliable sources.
@Mattpiz: yes, they are reliable, but not independent. Peer review does not make material independent. Papers published in a journal are a primary source for the information in the article, and a secondary source for discussion of material in other articles. See Wikipedia:No_original_research#Primary,_secondary_and_tertiary_sources - you cannot assess whether a subject is worth mentioning using only reports written by the subject; you can only assess that it exists. See also WP:LISTCOMPANY. Lists like this need selection criteria, otherwise every hobby project appears in this list. If in 3 months you can show an article that refers critically to your article, then that article qualifies as a secondary source. --Dirk Beetstra T C 07:14, 4 October 2018 (UTC)[reply]
@Beetstra: Thank you for the clarification and links. I'm not sure it makes sense to follow a policy made for companies when dealing with applications, but that's probably the closest choice among the examples listed on Stand-alone lists. It is however written in WP:SALLEAD that the list should begin with a lead section which "makes direct statements about the criteria by which members of the list were selected". I'd suggest adding a link to WP:LISTCOMPANY and a one-sentence summary of the rules in the lead section of this page. Something like: "Applications in this list must have a valid Wikipedia page or a citation to an independent, reliable source." Just as a remark, the external link [5] for OCLAVI doesn't have an author, and 11 of the 12 links in that external article link directly to the OCLAVI website's home page. It does not seem to be an independent reliable source. Mattpiz (talk) 05:00, 5 October 2018 (UTC)[reply]
@Mattpiz: I am indeed on the edge for OCLAVI; the reference seems to be an aggregator site, which generally means a plain rewrite of company press releases (not an independent analysis). On the other hand, it appears that this is an independent use of the software.
Let's see what can be rescued from the section below. I do not believe that subjects of this type have no verifiable independent uses at all. --Dirk Beetstra T C 11:22, 5 October 2018 (UTC)[reply]

Removal of items from the list that are not independently AND reliably sourced

The items below have been removed from the list, as they do not have independent (secondary) reliable sources. The only exception is an item that has its own article showing independent notability.


Software Description Platform License References
Alp’s Labeling Tool Macro plugin to label images for Detectnet / KITTI datasets. Windows 10, Ubuntu 14.04 Custom License [citation needed]
AnnoStation A cross-platform client–server web application to annotate videos and images. Allows labeling in larger teams, flexible task management, roles and rights, annotation of all kinds of primitives, and flexible XML based configuration. Developed by Hella Aglaia Mobile Vision for development and validation of ADAS and machine learning algorithms. Javascript, HTML, CSS, PHP Custom License [citation needed]
Annotation App Aims at solving image annotation needs, in the simplest, most efficient form by minimizing the number of interactions, context switch, and visual overload. It has a light configurable user interface. It supports bounding boxes, polygons, points, strokes and free outlines. It is easily embedded in a micro-tasks service like Amazon Mechanical Turk. Web (Elm, JavaScript, HTML) MPL-2.0 [citation needed]
Annotation of Image Data by Assignments (AIDA) A web-based annotation system that allows the definition of prescribed annotation tasks for a given study. Javascript, HTML, CSS, Groovy MIT License [citation needed]
Annotorious Annotorious is an Open Source image annotation toolkit written in JavaScript. JavaScript, HTML, CSS[1] MIT License [citation needed]
CVAT Computer Vision Annotation Tool (CVAT) is a web-based tool to annotate video and images for computer vision algorithms. CVAT includes: interpolation of bounding boxes between key frames, automatic annotation using the TensorFlow OD API, shortcuts for the most critical actions, a dashboard with a list of annotation tasks, LDAP and basic authorization, etc. UX and UI were optimized especially for computer vision tasks. Python (Django), Javascript, HTML, CSS MIT License [citation needed]
DataLoop DataLoop's platform offers manual and automatic image annotation capabilities. It supports image classification, creating bounding boxes, polygons and pixel level semantic segmentation (masks). The platform is also used to manage data (images), tasks and human workforce. Custom License [citation needed]
DataTurks DataTurks supports image classification, image bounding box, image segmentation, document annotation, NLP annotation etc. Provides real-time information on the dataset and the labeler performance. Data can be directly uploaded and downloaded in standard formats like PascalVoc, TensorFlow, Stanford NLP etc. JS, HTML Free for OpenData, otherwise custom license [citation needed]
Edgecase.ai A cloud based image annotation tool to label images for bounding box object detection with audit of workers. Manual workers can also be hired via edgecase.ai [citation needed]
FastAnnotationSingleObject This tool can be used to annotate a large number of images (draw bounding boxes and assign labels) in PASCAL VOC format very quickly, provided the user has collected the image corpus in an organized manner and each image has only one object of interest. The main goal of this tool is to reduce manual annotation time.[citation needed] You can annotate around 6000 images per day (8 hours) using this tool.[citation needed] MATLAB GPL-3.0 [citation needed]
FastAnnotationTool A tool using OpenCV to annotate images for image classification, optical character reading, etc. C++[2] GNU GPL [citation needed]
gtmaker An image annotation tool for bounding box and contour. Windows, macOS MIT License [citation needed]
Image Annotator Plug-in for Drupal The module allows you to annotate images, and works in combination with field_collection. The module defines a new field called 'Image Annotator' that can be linked to an image field, so you can add multiple markers to an image. PHP, Drupal GNU GPL [citation needed]
Image Annotator Plug-in for Wordpress (Drupal) This is a plugin that uses the HTML5 canvas and FabricJS to allow you to add shapes and text on top of images and display those images. PHP, Javascript, Wordpress GNU GPL [citation needed]
Images Annotation Programme A web application to annotate a collection of images. Javascript, HTML, CSS, PHP MIT License [citation needed]
ImageTagger An online platform for collaborative image labeling. It allows bounding box, polygon, line and point annotations and includes user, image and annotation management, annotation verification and customizable export formats. Python (Django), Javascript, HTML, CSS MIT License [citation needed]
jsoda JavaScript web application for bounding box annotation for object detection. Javascript, HTML, CSS MIT License [citation needed]
LabelImg LabelImg is a graphical image annotation tool for labeling object bounding boxes in images. Python MIT License [citation needed]
LEAR Image Annotation Tool A tool that facilitates the annotation of objects in images with bounding boxes. C++ (with Qt library) GNU GPL [citation needed]
Microsoft VoTT A cross-platform tool to annotate videos and images. Supports computer-assisted object tracking with meanshift and direct deep learning framework integration with CNTK, Tensorflow, and YOLO to minimize manual annotation work. Javascript, HTML, CSS, Windows, macOS, Linux[3] MIT License [citation needed]
OCLAVI OCLAVI is an application to annotate images stored in Google Drive or Amazon S3, either alone or as a team, feeding into machine learning, artificial intelligence and NLP based models. OCLAVI stands for Object (Image) Classification and Annotation for Computer Vision Models. It supports all basic shapes: bounding box, circle, polygon, point and cuboid. It has REST API export, so you can feed data directly to your model. JS, HTML Custom License [4]
Paperworks A paper-and-marker based annotation system. Images are tiled into PDF pages, which are then printed, annotated with color markers, and scanned. The system then extracts the color channels from the scanned images and converts them to annotation masks. Python BSD 2-clause [citation needed]
Philosys Label Editor / Ground Truth Annotator Philosys Label Editor / Ground Truth Annotator allows full annotation of 2D images and 3D scenes represented as point clouds, with geometrical markers, properties and semantic segmentation. It is freely configurable via XML and produces XML result data. It is used in the automotive market to produce data for deep neural network machine learning and for validation of ADAS and autonomous driving algorithms. Windows 7, Windows 10 Custom License [citation needed]
Pixorize Pixorize is a browser-based image annotation platform where users can upload, annotate, and share images online. Pixorize simplifies annotations for studying, giving feedback, explaining, and more. Javascript, HTML, CSS Custom License [citation needed]
Ratsnake Image Annotation Tool A tool for semantically-aware graphic annotation of images. Allows manual annotation through the use of polygons, splines and grids. It also incorporates a customizable Active Contour Model to enable semi-automatic segmentation of objects. Java Custom License [5]
RectLabel An image annotation tool to label images for bounding box object detection and segmentation. Mac OS X Custom License [citation needed]
Semantic Segmentation Editor A web based labeling editor dedicated to the creation of training data for machine learning. The tool has been developed in the context of autonomous driving research. It supports images (.jpg and .png files) and point clouds (.pcd files). JS, PaperJs, ThreeJS, React MIT License [citation needed]
Sequence.work Cloud based annotation system. It can provide its own workforce ("workers") or users can provide their own. Javascript, HTML, Node.js Custom license [citation needed]
SimAnno A web based image annotation tool with basic functionality. Angular 5, Python 3 Apache 2 License [citation needed]
Simple Image Annotator An image annotator with basic functionality. Python, Javascript, HTML, CSS MIT License [citation needed]
szoter Adobe Flash Custom License [citation needed]
VAST A utility application for manual annotation of large EM stacks. C++ [6]
Amira A software platform for 3D and 4D data visualization, processing, and analysis. It is being actively developed by Thermo Fisher Scientific in collaboration with the Zuse Institute Berlin (ZIB), and commercially distributed by Thermo Fisher Scientific. C++ Trialware [7]

References

  1. ^ "Annotorious Core Library Source Code". Retrieved 26 January 2017.
  2. ^ "FastAnnotationTool Source". Retrieved 26 January 2017.
  3. ^ "End to End Object Detection in a Box". Retrieved 17 November 2017.
  4. ^ "Shaping Tomorrow's Agriculture with Artificial Intelligence". MC.AI. Retrieved 2018-07-04.
  5. ^ Iakovidis, D. K.; Goudas, T.; Smailis, C.; Maglogiannis, I. (2014-01-27). "Ratsnake: A Versatile Image Annotation Tool with Application to Computer-Aided Diagnosis". The Scientific World Journal. 2014: 1–12. doi:10.1155/2014/286856. ISSN 2356-6140. PMC 3926425. PMID 24616617.
  6. ^ Berger, Daniel R.; Seung, H. Sebastian; Lichtman, Jeff W. (2018-10-05). "VAST (Volume Annotation and Segmentation Tool): Efficient manual and semi-automatic labeling of large 3D image stacks". Frontiers in Neural Circuits.
  7. ^ "Amira for Life Sciences". Retrieved 2018-10-05.

Note that this is NOT about the existence of the subject; this is about whether the world outside of the subject has noticed the subject. Sourcing can include material that independently uses the subject, or independent reviews or comparisons of single or multiple subjects. Blog posts (generally unreliable, often self-published), self-published material (i.e. primary/not independent) and material independently published but written by the subjects (still primary, not independent of the subject) are not acceptable. --Dirk Beetstra T C 06:40, 4 October 2018 (UTC)[reply]


@Beetstra: nice cleanup. However, I think some well-known tools were removed. For example, VATIC, VoTT, ViTBAT, Scalabel, and BeaverDam are all referenced and compared in https://blog.claysciences.com/2018/a-comprehensive-guide-to-annotating-videos-for-machine-learning/ Arielbaz (talk) 16:23, 4 October 2018 (UTC)[reply]

@Arielbaz: that is a blog, and blogs are generally not considered reliable sources. There certainly must be more authoritative material out there (and I mean that). --Dirk Beetstra T C 19:26, 4 October 2018 (UTC)[reply]

Dataturks

@Mohangupta13: we are asking for independent, reliable sources, not for 'externally validated references'. That is, what people who are NOT related to the subject write about the subject. That can be reviews, that can be organisations implementing the subject, etc. Blog posts are (generally) not reliable, since anyone can post blogs.

And if those don't exist, as you seem to allude to in your edits, then it shouldn't be listed here. But again, I cannot imagine for this subject (automated image annotation is implemented in software many people have on tablets; these tools are just the manual counterpart) that there is no review describing a good number of them, or that there are no external organisations writing about how they used one or two of them. --Dirk Beetstra T C 10:00, 17 October 2018 (UTC)[reply]


@Beetstra: Are the news articles below independent, reliable sources? I am not editing the main article anymore due to the conflict of interest, hence asking you to take a look and make a choice.

https://inc42.com/buzz/meet-the-19-startups-selected-for-the-summer-2018-accelerator-cohort-of-axilor/

https://www.theindianwire.com/startups/axilors-summer-2018-accelerator-program-startups-62856/ — Preceding unsigned comment added by Mohangupta13 (talkcontribs) 10:45, 17 October 2018 (UTC)[reply]

@Mohangupta13: Those are not independent; you entered Dataturks into an accelerator program, and you got selected (and as far as I can see, there need not have been any selection at all; it does not say how many companies applied. For all I know there were just 19 applications). --Dirk Beetstra T C 11:01, 17 October 2018 (UTC)[reply]
@Beetstra: I am not sure what might meet your criteria; these are legitimate news websites. The same argument applies to almost any source, e.g.: "you got funding? I am not sure if you were the only company that applied", "You have stars on your GitHub page; not sure you didn't create 50 accounts and give the stars yourself". I rest my case. — Preceding unsigned comment added by Mohangupta13 (talkcontribs)
@Mohangupta13: Indeed, so what, you had 50 stars. Those may all be independent sources, but not reliable ones. Are you sure that no one other than employees of your company has ever used your software? --Dirk Beetstra T C 13:49, 17 October 2018 (UTC)[reply]
@Beetstra: Exactly my point. The items currently on the list have just a TechCrunch article and a GitHub page added - are these reliable according to your above definition? (Though I find them perfectly reliable, and I also find the sources I am pointing to reliable; the question is more about what the moderator finds reliable.) Regarding who uses our software: we have thousands of people who use it, and all their projects are open as well, which one can browse (here https://dataturks.com/projects/Trending), but I guess that too may not be reliable by the above definitions. — Preceding unsigned comment added by Mohangupta13 (talkcontribs)
@Mohangupta13: 'The items you have put on the list' .. I have not put anything on this list; I have, however, deleted everything that did not show any references with at least an air of being independent and reliable. I do agree, the TechCrunch article that is left does seem rather thin, as if it was invited. Still, that is not a reason to add other spamcruft; it may be a reason to remove it.
I find them reliable, but you have missed the point, as have many other people we had a similar discussion with on another page quite recently. The material needs to be reliable, and written by an independent person. Even if you yourself write an article on Dataturks and manage to get it published in a peer-reviewed journal, published by a company, it remains a primary, and hence not independent, source. A secondary source is written by someone not involved with the subject that they write about. The material you show, the reports from independent people on your trending page, IS independent, it is secondary .. but it is not reliable. The source needs to be independent (secondary) AND reliable. Not OR, AND. --Dirk Beetstra T C 10:08, 18 October 2018 (UTC)[reply]
@Mohangupta13: forgot to ping. (P.S. you need to sign your posts, and can you please read the remarks on your own talkpage?) --Dirk Beetstra T C 10:10, 18 October 2018 (UTC)[reply]

VoTT

@Beetstra: to @Arielbaz:'s point. As the developer of VoTT (https://github.com/Microsoft/VoTT), it would be good to know exactly why it was removed and what the criteria are for this list. VoTT is actively maintained, used, and open-sourced under the MIT License. It has features, such as direct export and active learning, that other listed tools do not, has 751+ stars on GitHub, and is included in the official Azure DSVM image (https://blogs.msdn.microsoft.com/uk_faculty_connection/2017/09/29/microsoft-deep-learning-virtual-machine/). Other tools that were removed, such as VATIC, predate all the tools on this list. It's one thing to say that blogs are generally not considered reliable sources, but the reference to VoTT came from the official Microsoft developer blog (https://www.microsoft.com/developerblog/2017/04/10/end-end-object-detection-box/) and was later deleted. I would note that there are many tools on Wikipedia, such as TensorFlow (https://en.wikipedia.org/wiki/TensorFlow) and PyTorch (https://en.wikipedia.org/wiki/PyTorch) among others, that also have references to corporate developer blogs. It seems that this purge was arbitrary. Annotorious, for example, wasn't removed even though its reference is just a link to a repo, it's not actively maintained, and it has fewer stars than VoTT. If possible it would be great to have all the actively maintained tools, such as VATIC and Microsoft VoTT, returned to this list. I'm sorry if this comment isn't in the right place, as I am a relative novice to wiki discussions. — Preceding unsigned comment added by 109.65.235.147 (talkcontribs)

I am sorry, as you predicted, the post was not in the right place and hence missed.
The point made is that the references have to be independent AND reliable. Both TensorFlow and PyTorch satisfy that. You will have to show that, in addition to any dependent or unreliable sources, there are sources that are both independent AND reliable that have written about the subject. For all items that were removed, that was the criterion. And I do think that many of the removed items (including the ones that are no longer maintained) can fulfill that criterion. --Dirk Beetstra T C 19:52, 16 November 2018 (UTC)[reply]

Thank you for the clarification. There are many references to VoTT, but they come from either Microsoft directly or from users and their own developer blogs, as listed in the previous edit. I have no doubt in my mind that VoTT will be referenced independently by a credible source in the near future, if it hasn't been already, as the project is gaining traction. I think it is difficult for many open-source projects in niche domains to satisfy this criterion, since Wired, the Wall Street Journal, or the Washington Post don't really write about image annotation tools, and even a source like TechCrunch usually features these tools only as part of featured or sponsored content. It seems to me this practice may also encourage bad behavior, such as requiring citations for tool usage. In the meantime I will defer to and respect your standards and practices until the community independently cites our tool in a credible manner. — Preceding unsigned comment added by 109.65.235.147 (talk) 21:47, 25 November 2018 (UTC)[reply]

Added VoTT again, since it is an important tool and should be included in this list. I tried to provide meaningful references. WikiWriter123 (talk) 10:14, 16 May 2019 (UTC)[reply]

And I have removed it again. Those two blogs are both not independent, and having stars and likes on GitHub is just not enough. References to blogs are fine, IF there is also an independent reference. Either we follow a standard, or we accept the spamhole. --Dirk Beetstra T C 10:44, 16 May 2019 (UTC)[reply]

Supervisely removed

Per WP:CSC, tools should only be added if they are notable or likely notable. Notability on Wikipedia is established with in-depth coverage in independent reliable secondary sources (see WP:GNG). A passing mention in a dissertation doesn't meet this criterion. For example, a detailed review in a reputed computer magazine, or detailed coverage in an IT-related article or book, would establish at least some evidence of notability. GermanJoe (talk) 15:11, 28 November 2018 (UTC)[reply]

@GermanJoe: The links below contain references to Supervisely. All these sources are independent. Here is proof that research papers and independent news articles are using Supervisely. I hope that some of them can be used as reliable references. Please give your feedback.

https://arxiv.org/pdf/1712.05053.pdf - Paper “Pediatric Bone Age Assessment Using Deep Convolutional Neural Networks”. Joint work of researchers from MIT, University of Michigan, Lyft Inc., and Neuromation. They used Supervisely as the annotation platform for their research and mention this fact several times.

https://venturebeat.com/2018/11/28/project-fi-rebrands-as-google-fi-opens-to-iphones-and-most-android-phones/ - overview of the AI Fluid Annotation technology from Google. The Supervisely platform is mentioned as a competing platform.

https://habr.com/company/newprolab/blog/352572/ - comparison of various image annotation tools (including Supervisely) from New Profession Lab (http://newprolab.com/en/), which develops Big Data related educational materials for corporate customers (in Russian).

https://medium.com/@humansintheloop/the-best-image-annotation-platforms-for-computer-vision-an-honest-review-of-each-dac7f565fea - a company that provides image annotation services compares different platforms in a blog post. Supervisely is described in a dedicated chapter.

https://webmonks.vision/services/manual-data-for-computer-vision/ - an annotation service that uses the Supervisely platform (mentioning Supervisely directly) to provide labeling services to their customers

https://blog.appliedai.com/synthetic-data/ - a B2B AI platform (appliedAI) mentions Supervisely as a platform for storing and managing synthetic data

https://dspace.cvut.cz/bitstream/handle/10467/76430/F3-DP-2018-Racinsky-Matej-diplomka.pdf - Czech Technical University in Prague, Master’s thesis. In a dedicated chapter, “Manual annotation services”, the author describes a combination of Amazon Mechanical Turk (AMT) and the Supervise.ly platform.

http://www.bmstu.ru/content/documents/s,bornik.pdf - Bauman Moscow State Technical University, Russian national student conference; in the collection of papers, one paper on page 257 compares the functionality of LabelMe, Supervisely and NVIDIA DIGITS 5.

https://medium.com/datadriveninvestor/small-data-deep-learning-ai-a-data-reduction-framework-9772c7273992 - in this post Harsha Angeri (CEO of TRIBE Tech, previously Senior GM & Lead at Bosch India) describes one feature of Supervisely: AI-powered annotation for semantic segmentation.

https://www.slideshare.net/Jonathon_Wright/enterprise-augmented-intelligence-bridging-the-cognitive-gap-between-humans-ai - at the Digital Assured Conference, Jonathon Wright (https://www.linkedin.com/in/automation/ CTO and Co-Founder at Digital Assured, TEDx speaker) describes the Supervisely platform on slide 20 of his presentation “Bridging the Cognitive GAP between Humans & AI”.

https://www.udemy.com/yolo-v3-robust-deep-learning-object-detection-in-1-hour/ - the company Augmented Startups creates courses on the latest tech in augmented reality, AI and the Internet of Things and has 36,369 students in total. They created a dedicated course on udemy.com that describes all features of Supervisely using a real-world example: how to build an object detection system from scratch with Supervisely. Augmented Startups also has more than 30k subscribers (several videos about the Supervisely platform have been published on YouTube). — Preceding unsigned comment added by Borisovyy (talkcontribs) 18:15, 28 November 2018 (UTC)[reply]

As I have already explained, passing mentions are not sufficient to establish notability. So please do not suggest sources that clearly do not meet WP:GNG requirements - that's just wasting time. Please check the link and read through the detailed description of these requirements. None of the checked sources seem "independent" and "secondary" and "reliable" (excluding most blogs) and "in some detail" at the same time. A few of the Russian ones may have a few details, but it's hard to judge without Russian-language knowledge. As mentioned in a short listing, the tool was published in 2017 (?). If this info is correct, it might simply be too soon, and it would probably be better to wait a year or two for it to become better covered in reliable secondary sources. If you don't want to wait, you could also try to follow the advice at WP:WTAF and WP:Your first article: create a short draft article with as many WP:GNG-compliant sources as possible (no blogs, mentions, etc.), and let an experienced reviewer evaluate and decide the notability of this topic. It may get accepted or rejected, but either way you'll get a clear review of the current situation. But please focus on the more promising sources, and avoid filling such a draft with unsuitable references. GermanJoe (talk) 20:25, 28 November 2018 (UTC)[reply]
On 26 February I added Supervisely because I had tested it myself as an independent user. GermanJoe removed my addition. Even if the tool is young, if you try it yourself you'll see it is more efficient than other listed tools. Supervisely was the first one I tried that was successful on our data. See the removed entry below.
Software Description Platform License References
Supervisely Supervise.ly[1] is an online platform that aims at reducing annotation time. It provides a wide variety of tools: pixel-level markup, large-image handling, various object types (rectangle, polygon, dot, line, skeleton), AI-assisted annotation and Mechanical Turk integration for crowdsourcing annotation jobs. Nodejs, MongoDB[2], Tensorflow, Keras, Docker[3] Cloud, On-Premises [4]


@GermanJoe: what is more notable than a scientific publication citing the tool? Another paper citing supervise.ly: https://arxiv.org/pdf/1901.03814.pdf — Preceding unsigned comment added by Herven1618 (talkcontribs) 12:00, 26 February 2019 (UTC)[reply]
Please read the previous discussions as well as WP:GNG and WP:RS about the definition of independent reliable sources. Arguments like "I tested it myself as an independent user" and "you'll see it is more efficient than other listed tools" just distract from the main criterion for including something on Wikipedia: the existence of independent reliable sources with some significant coverage of a given topic. Whether a product is good or bad, well-known or not, such considerations are largely irrelevant. To briefly review your sources: refs #1-#3 are self-published PR sources, and ref #4 is only an extremely short passing mention. The new ref #5 would be a start, but it only cites the tool as one used in a study, with almost no information about its functions and features. Maybe some of the non-English sources listed above may be more suitable to establish notability, but they would need checking by a native speaker. Again: blogs, promotional sources, self-published sources, and passing mentions are not sufficient. GermanJoe (talk) 14:13, 26 February 2019 (UTC)[reply]

Diffgram removed - insufficient source / conflict of interest

I have removed the recently added entry for Diffgram with this edit. According to some research, the company was founded less than 6 months ago, and there is no evidence that the software (in beta) is notable or noteworthy yet. A pseudo-review on a questionable blog like altcoinflow.com - or a commercial site like the original humansintheloop.org source - has zero credibility as a source (and a passing one-sentence mention is not sufficient anyway). Also, it's pretty obvious that the editor has an undisclosed conflict of interest. Editing affected articles without disclosure and prior talkpage review of suggested substantial changes is a violation of WP:COI. GermanJoe (talk) 23:27, 30 January 2019 (UTC)[reply]

Suggested edit - Addition of a "not compliant" software table

As an artificial intelligence user leading some deep learning projects in my company, I see annotation tools as the heart of any such project. Some of the tools are really efficient and deserve to be known.

To my mind, the current approved list is missing a lot of well-known tools that were removed during cleanup.

Deep learning, and artificial intelligence in general, are definitely hot topics. We should provide a more dynamic listing, not wait two years for tools to become ubiquitous (as suggested in the "Supervisely removed" paragraph).

Finally, I question the final listing and some previous removals or entry preservations. For instance, LabelBox's seed round is from July 2018; LabelBox is not widely covered, no secondary sources are referenced, and it is not ubiquitous. Still, it is in the "approved" list.

Why not provide two listings:

  1. Software with multiple secondary sources ~ Wikipedia:Notability#General_notability_guideline
  2. Other software — Preceding unsigned comment added by Herven1618 (talkcontribs) 12:00, 26 February 2019 (UTC)[reply]

Herven1618 (talk) 18:02, 26 February 2019 (UTC)[reply]

Wikipedia content should be based on independent reliable sources, and if content - especially promotional content - is added without such sources, it will be removed sooner or later. That's a non-negotiable content guideline based on long-standing consensus. So a section "Other software" not based on such independent sources would be removed. Wikipedia is an encyclopedia, not a product directory or PR platform. GermanJoe (talk) 14:32, 26 February 2019 (UTC)[reply]
@GermanJoe: I do understand the promotional aspect of things. I'm not proposing to add entries without external sources; at least one is the bare minimum. Also, your comment doesn't explain the presence of LabelBox compared to Supervise.ly. Herven1618 (talk) 18:02, 26 February 2019 (UTC)[reply]
LabelBox has a source with some detailed coverage, although TechCrunch isn't ideal for a number of reasons (see its entry, with some advice for cautious usage, at WP:RSP). If you think LabelBox doesn't belong in the list, you should start a discussion to suggest its removal (ideally in a separate thread, just to avoid confusion). This entry is a bit of a borderline case imo. GermanJoe (talk) 19:19, 26 February 2019 (UTC)[reply]
@Herven1618: The TechCrunch article is just about the limit of how low to go, and it is thin. What articles like these need are independent review articles that discuss (or even compare) a sizable number of these tools. Articles written by members of the team, articles on invitation (and to me, the TechCrunch piece on LabelBox reads a bit like that), or the 'look, we have a webpage, so we exist and should be listed' approach are not the bar. An arXiv paper is a preprint; it is not necessarily peer-reviewed. And the article was published barely a month ago; note also that they do not use supervise.ly, they use their dataset.
This list originally contained 34 items ([1]), totally indistinguishable between properly notable products and the hobby projects of a retired geek (or anything in between). We now keep this list minimal, and only include items that can show independent notability (i.e., there is a Wikipedia article for the subject), significant independent use (a reasonably notable company B has implemented the product independently; independent peer-reviewed papers use the subject because of specific needed features), or that are the subject of an independent review (preferably a review that compares or describes several). --Dirk Beetstra T C 06:42, 27 February 2019 (UTC)[reply]

TrainingData.io - insufficient sources

Two of the four sources are a self-published page and a GitHub entry; neither establishes notability. The third source, Lionbridge Technologies, is just a passing mention in a promotional blog/PR article. The last source, with more details, is an NVIDIA blog post. Generally - with a few exceptions - blogs are not considered reliable sources. But aside from reliability, the article is also not fully independent. Its author is "part of NVIDIA's corporate communications team" (per her author page), and TrainingData.io is a "member of the NVIDIA Inception virtual accelerator program" (per the author's own disclosure in the source article). In short, this article is an advertorial, part of the company's and NVIDIA's marketing efforts, and not an independent, reliable source. GermanJoe (talk) 11:01, 27 October 2019 (UTC)[reply]

Studierfenster - insufficient sources

The only source is self-published, and the linked article also consists only of self-published sources. I encourage you to add the tool (it seems quite cool) back to the list with sufficient third-party sources. Floriv1999 (talk) 9:31, 15 October 2020 (UTC)