BasicAI to Exhibit at International Conference on Computer Vision


Irvine, California (October 24, 2019) - The International Conference on Computer Vision (ICCV) is a biennial research conference sponsored by the Institute of Electrical and Electronics Engineers (IEEE). Together with CVPR, it is considered one of the premier global conferences in the field of computer vision. This year, the conference will be held from October 29 to November 2, 2019 in Seoul, Korea.

Join BasicAI at Booth D-9 as we present our latest advances in annotation software tools. The BasicAI platform has been developed specifically to meet the needs of data science professionals across industries and use cases, including robotics, autonomous driving, retail, financial services, and healthcare. The platform is available to customers who use BasicAI labeling services, or as a stand-alone software license for organizations that manage their own annotation workforce or have data security requirements calling for on-premises deployment.

With a dedicated global workforce across regions and time zones, BasicAI provides high-quality data labeling at scale. Our expert team of annotators, combined with robust quality-control procedures, delivers accurate and timely results.

The BasicAI platform includes dozens of annotation tools supporting a wide variety of industries, including autonomous vehicles, industrial automation, robotics, healthcare, and financial services, to name a few. Our annotation capabilities include semantic segmentation, text recognition, object tracking, audio classification, 3D point cloud object recognition, and more.

BasicAI is proud to present at ICCV as an exhibitor together with the world’s leading AI/ML companies and organizations. Please stop by our booth to learn how BasicAI is enabling our customers to improve the accuracy of their machine learning models by utilizing our efficient and cost-effective software and services.

To learn more, contact us at



Selina Liu