AI has a problem: building more powerful algorithms demands ever greater amounts of data and computing power and a reliance on centralized cloud services. This not only generates enormous carbon emissions but also limits the speed and privacy of AI applications. Tiny AI, however, is changing that.
While tech giants and academic researchers work on new algorithms to shrink existing deep-learning models without losing their capabilities, a new generation of specialized AI chips is emerging that packs more computational power into tighter physical spaces. These chips can train and run AI on far less energy.
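One common way to shrink a trained model is post-training quantization, which stores weights at lower numerical precision. The sketch below shows this idea using PyTorch's dynamic quantization; the toy network, layer sizes, and input shape are illustrative assumptions, not details from the article.

import torch
import torch.nn as nn

# A small example network standing in for a larger deep-learning model.
model = nn.Sequential(
    nn.Linear(256, 128),
    nn.ReLU(),
    nn.Linear(128, 10),
)

# Convert the Linear layers' weights from 32-bit floats to 8-bit integers.
# This shrinks the model and speeds up CPU inference, usually with only a
# small loss in accuracy.
quantized_model = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Inference works the same way as with the original model.
example_input = torch.randn(1, 256)
output = quantized_model(example_input)
print(output.shape)  # torch.Size([1, 10])

Techniques like this, along with pruning and distillation, are part of what makes it practical to run models on phones and other edge devices rather than in the cloud.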
These advances are just starting to reach consumers. Last May, for example, Google announced that it can now run Google Assistant on users' phones without sending requests to a remote server. Apple likewise runs Siri's speech recognition and its QuickType keyboard locally on the iPhone. Other companies, such as IBM and Amazon, now offer developer platforms for building and deploying tiny AI.
These services offer a variety of benefits. Without having to ping the cloud every time they need a deep-learning model, existing services like voice assistants, autocorrect, and digital cameras will respond faster and work better. Tiny AI will also make new applications possible, such as mobile-based medical-image analysis or self-driving cars with faster reaction times. Localized AI is also better for privacy, since your data no longer needs to leave your device to improve a service or feature.
But as the benefits of AI become more widespread, so will its challenges. Tiny AI could make surveillance systems and deepfake videos harder to combat. Deepfake videos use AI to swap the person in an existing video with someone else's likeness. Discriminatory algorithms could also proliferate. Researchers, engineers, and policymakers need to work together now to develop technical and policy checks against these potential harms. It may be a monumental task, but there is still much researchers can do to make these services safer.