
Cryptocurrencies have been a hot topic as of late, because their adoption has threatened to turn the financial industry on its head. The true power of cryptocurrencies, though, is the underlying blockchain technology—and finance is just the tip of the iceberg when it comes to its implementation. The level of decentralization offered by blockchain technology has promised to disrupt a number of other industries outside of finance as well, including cybersecurity, voting—and now cloud computing.
In “The end of the cloud is coming,” a recently published VentureBeat article, Viktor Charypar argues that we’re facing the end of the cloud. And I agree with him. Chasing the digital gold that is cryptocurrency, people around the world have acquired a huge amount of powerful computing resources for mining purposes. However, as more and more cryptocurrency is mined, the activity becomes less and less profitable. A quick glance at online forums reveals that miners who bought expensive GPUs in the last two to four years are already trying to sell them.
Selling the GPUs, though, is not the only answer for these individuals to earn a healthy return on their investments. In search of profits, IaaS “Uber drivers,” through platforms like SonM and Golem, have begun popping up, ready to rent their computing resources for less money than the major cloud infrastructure providers like Amazon and Microsoft.
This new model of leveraging resources from a decentralized network of computers is known as fog computing, and it promises to open the door for AI startups everywhere. Here’s why.
Fog computing is more cost-effective
For startups, controlling cost is of utmost priority. And for AI startups, this can be extremely difficult given the high level of computing resources required to run deep-learning algorithms. In many cases, computing resources are the largest item in an AI startup’s budget.
Accordingly, fog computing offers these startups a cost-effective alternative to paying for additional resources from cloud provider giants like Amazon and Microsoft. A rough approximation shows that a fog computing infrastructure can be several times less costly than cloud-based solutions (AWS, Azure, etc.), even if the individuals in the decentralized network were paid twice what they could earn by mining Ethereum.
To illustrate, I made a very rough comparison between AWS costs and the income from Ethereum mining. In that case, if a customer pays $10 per month per camera for my service, I can pay $5 per camera to miners. A miner with an Nvidia GTX 1080 Ti GPU (11 Tflops) can earn about $75 per month or less by mining Ethereum; in February, it was just $43. At the same time, this GPU can process 500 faces per second, which works out to about 33 cameras at full load (15 frames per second, one face per frame). That means I would pay about $166 per month to this miner, whereas I would pay about $544 per month for the same performance on AWS.
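The back-of-the-envelope arithmetic above can be checked with a few lines of Python. All figures are the rough estimates from the text, not measurements:

```python
# Rough cost comparison between a miner-backed fog setup and AWS,
# using the approximate figures quoted in the paragraph above.

GPU_FACES_PER_SEC = 500    # faces/sec a GTX 1080 Ti can process
CAMERA_FPS = 15            # frames/sec per camera, one face per frame
PRICE_PER_CAMERA = 10      # $/month a customer pays per camera
MINER_SHARE = 0.5          # half of that revenue goes to the miner
AWS_COST = 544             # $/month for comparable AWS capacity (estimate)
ETH_INCOME = 75            # $/month the same GPU earns mining Ethereum, at best

cameras_per_gpu = GPU_FACES_PER_SEC / CAMERA_FPS                 # ~33 cameras
miner_income = cameras_per_gpu * PRICE_PER_CAMERA * MINER_SHARE  # ~$166/month

print(f"cameras per GPU: {cameras_per_gpu:.0f}")
print(f"miner income: ${miner_income:.2f}/month vs ${ETH_INCOME} from mining")
print(f"AWS equivalent: ${AWS_COST}/month, {AWS_COST / miner_income:.1f}x more")
```

Even after paying the miner more than double their mining income, the fog option comes out at roughly a third of the AWS price in this estimate.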
Despite potential cost savings, this switch shouldn’t come at the expense of quality. An AI startup must also consider that a decentralized infrastructure should meet certain criteria, including flexibility and scalability, a balance of centralized and decentralized elements, data protection, and a simple payment infrastructure.
Fog computing is flexible and scalable; it might be reliable too
For AI startups, flexibility and scalability are essential in moments when the demand on a SaaS solution suddenly increases. With my company's facial recognition software, this often happens if a client has a marketing campaign that brings a lot of customers to their shop or restaurant. It also happens at parties, concerts and sporting events where the number of faces to be recognized multiplies by ten, one hundred, or even one thousand times. This inflated use pushes technological infrastructures to their limits.
Fog computing, like cloud computing, offers a great degree of scalability for an AI startup’s operations. To ensure this, however, the decentralized network must have software installed that allows each network participant to receive tasks and send results back. The software must also be able to detect the hardware and performance level of each node. This allows a startup to appropriately build its IT infrastructure to meet its needs, just as it would under a traditional cloud computing model.
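A node agent of the kind described above might report its capabilities like this. The field names and the crude CPU benchmark are purely illustrative assumptions, not any specific platform’s protocol:

```python
# Illustrative sketch of a fog node agent reporting its hardware and
# measured performance when it joins the network (field names assumed).
import json
import time

def benchmark_node(iterations=200_000):
    """Crude CPU benchmark: time a fixed amount of arithmetic work."""
    start = time.perf_counter()
    total = 0
    for i in range(iterations):
        total += i * i
    return iterations / (time.perf_counter() - start)  # ops/sec

def node_report(node_id, gpu_model, gpu_tflops):
    """Build the JSON capability report the orchestrator would receive."""
    return json.dumps({
        "node_id": node_id,
        "gpu": gpu_model,
        "tflops": gpu_tflops,
        "cpu_ops_per_sec": round(benchmark_node()),
    })

print(node_report("node-17", "GTX 1080 Ti", 11.0))
```

With reports like this, the scheduler can match resource-heavy tasks to the nodes actually capable of running them.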
However, careful consideration must be taken when doing so. The beauty of fog computing is that nodes can be used from anywhere in the world, but the corresponding challenge is ensuring a high-quality broadband channel and connection speed. Often, free computing resources can be found somewhere in Africa or Latin America, but there is a high chance that they may encounter a bad connection. In theory, this becomes less of a risk as more nodes participate in the decentralized network, but at the moment AI startups should focus on high-performance nodes only to ensure an adequate level of flexibility, scalability, and reliability.
To further protect from service failures, AI startups have the option of using a hybrid distributed computing environment, part of which remains centralized. The decentralized portion can be used for resource-consuming tasks, while the centralized portion is used for orchestrating nodes to control the work of the distributed network and balance the load by connecting and disconnecting additional resources as necessary.
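The hybrid model above can be sketched as a centralized orchestrator that prefers decentralized nodes and falls back to a small centralized pool when none are free. This is a minimal sketch with assumed names, not a production scheduler:

```python
# Sketch of the hybrid model: a centralized orchestrator dispatches
# resource-heavy tasks to fog nodes and falls back to centralized
# capacity when no node has spare room. All names are illustrative.

class Orchestrator:
    def __init__(self, central_capacity=2):
        self.fog_nodes = {}                    # node_id -> free task slots
        self.central_capacity = central_capacity

    def register(self, node_id, capacity):
        """Connect a fog node with a given number of free task slots."""
        self.fog_nodes[node_id] = capacity

    def dispatch(self, task):
        """Place a task, preferring the fog node with the most spare capacity."""
        available = [n for n, cap in self.fog_nodes.items() if cap > 0]
        if available:
            node = max(available, key=self.fog_nodes.get)
            self.fog_nodes[node] -= 1
            return ("fog", node)
        if self.central_capacity > 0:          # centralized fallback
            self.central_capacity -= 1
            return ("central", None)
        raise RuntimeError("no capacity: connect more nodes")

orch = Orchestrator()
orch.register("node-a", 1)
print(orch.dispatch("detect-faces"))   # lands on the fog node
print(orch.dispatch("detect-faces"))   # falls back to the centralized pool
```

The same loop that balances load here is also where additional nodes would be connected or disconnected as demand spikes and recedes.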
Fog computing offers the required level of data and fraud protection
In addition, the centralized portion of the IT infrastructure can be used to manage sensitive data. A common concern with cloud computing is that user data gets passed to a third party. Given that AI startups often manage highly sensitive data, this becomes an even bigger concern. Under fog computing, unlike cloud computing, customer data can first be obfuscated in the protected, trusted, centralized environment, so that the raw data completely bypasses the third-party nodes of the decentralized network. This, in turn, enables a much higher level of data protection.
In addition, fog computing can offer increased protection against fraud. Another valid concern among AI startups may be that, under a fog computing model, members of the decentralized network could try to cheat the system by writing a simple script that allows them to continue mining cryptocurrency while appearing to perform computational tasks. However, there are ways to easily safeguard from this type of deception and detect untrustworthy nodes.
One example is to use a proof-of-work method whereby complex tasks are loaded to high-performance nodes, and fragments of the same tasks are sent to nodes with lower performance to check the accuracy of the results. If the results from the two nodes do not match, the nodes can be disconnected from the system and placed on a blacklist, banning future use. This ensures that members of the decentralized fog computing network only receive payment after the system has sufficient proof that the work was completed and the results are real.
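One simple variant of that spot-check could look like the following. In practice a mismatch alone does not tell you which node is wrong, so both might be flagged; this sketch, with entirely illustrative names, blacklists only the node being audited:

```python
# Sketch of the fragment spot-check described above: a fragment already
# processed by a worker node is re-run on a checker node, and a node whose
# answer disagrees with the checker is blacklisted (illustrative only).

blacklist = set()

def verify_fragment(node_id, node_result, checker_result):
    """Return True if the audited node's fragment result matches the checker's."""
    if node_result != checker_result:
        blacklist.add(node_id)  # ban the node from future tasks and payment
        return False
    return True

print(verify_fragment("node-42", 3, 3))  # honest node: results agree
print(verify_fragment("node-7", 0, 2))   # cheating node: blacklisted
print(sorted(blacklist))
```

Payment would then be released only for fragments that pass this check, which is what makes the mining-while-pretending-to-work script unprofitable.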
Currently, there is no easy payment and billing system that sits on top of fog computing environments; however, extending payments to the decentralized network using cryptocurrencies is a likely solution that will be implemented in the next three years. Once the minor kinks are ironed out, the myriad benefits offered by fog computing will burst the door open for AI startups running resource-heavy workloads like machine learning and neural networks.
This article is published as part of the IDG Contributor Network. Want to Join?
Vladimir Tchernitski, CTO and co-founder of Faceter, is a computer vision expert with four years of experience in the field of convolutional neural networks (CNNs) and more than 25 years of experience in software development. He was previously head of the R&D department at Azoft.