In response to the continuing demands of evolving AI, diverse hardware and software solutions are being developed. To process staggering amounts of data and to support networks of IoT devices, cloud-based distributed architectures are being implemented. IT infrastructure is also evolving in diverse ways to provide ease of access and to respond to users' changing needs.
Like other social infrastructure, IT infrastructure underpins society without most people being aware of it. An IT infrastructure consists of the hardware that forms the base of an IT system and the software that operates it: hardware such as servers and networks, system software such as operating systems, and middleware such as databases.
In recent years, advances in cloud computing have made it easier for IT engineers to focus more on application development and less on IT infrastructure. In a cloud-based environment, both processing capacity and performance can be increased or decreased on demand, and getting started requires little more than designating a set of resources and selecting a service. Cloud providers perform maintenance automatically and continually implement improvements, simplifying and streamlining operations and adopting new technologies to improve performance, security and cost, and to respond quickly to new requirements. There are now few reasons to avoid placing all services, environments, platforms and applications in the cloud unless special requirements exist, such as the need for complete control of a device to ensure stable operation, or an unusual network configuration driven by extremely high security requirements. Furthermore, there is little need to be aware of the IT infrastructure functioning behind these robust cloud environments.
The importance of understanding both the current and future state of IT infrastructure, however, increases when considering the needs of new businesses, because the evolution of infrastructure has enabled ideas that were once considered impossible. An example is the emergence of deep learning. At its inception, deep learning's algorithms were expected to advance AI significantly; however, no IT infrastructure existed on which they could operate at practical speed. The development of software that exploited the graphics processing unit (GPU), a massively parallel processor originally built for multimedia applications, changed the situation. This new IT infrastructure enabled deep learning algorithms to run roughly 100 times faster than on traditional processors, rapidly advancing research and development and proving the validity of the method. AI soon matched or exceeded human ability in tasks such as image identification and voice recognition, making AI a source of competitive advantage in business.
Any discussion of the evolution of IT infrastructure centers on the central processing unit (CPU), the core of a computer's computing ability. From its birth, the CPU increased steadily in speed until about the year 2000, when improvements diminished, mainly because of physical limitations. After that, processor architecture became the focal point for improvement. The technologies at the center of the next advance were the multicore processor and off-loading to dedicated processors.
A multicore processor installs multiple cores, the units that actually perform computation, in a single processor; more cores allow more work to proceed in parallel. CPUs with 28 cores have appeared, and GPUs with more than 5,000. However, if work is unevenly assigned, so that the granularity of calculation differs from core to core, parallelism decreases and the full potential of the hardware cannot be realized. To optimize the use of multiple cores, dedicated middleware divides the calculation to match the number of cores available. It was not until this type of middleware was developed that multicore GPUs became central to machine learning, and future mainstream multicore processors will probably require similar innovations in middleware.
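The role of that middleware, dividing a calculation to match the number of cores so no core is left idle, can be sketched in a few lines. This is a minimal illustration, not a real middleware implementation: the function names are hypothetical, and a thread pool is used for simplicity (CPU-bound Python code would need processes or native middleware to actually scale across cores).

```python
# Sketch of the "divide work by core count" idea, with hypothetical names.
import os
from concurrent.futures import ThreadPoolExecutor

def split(data, parts):
    """Middleware step: divide the input into roughly equal chunks,
    one per core, so the granularity is the same for every core."""
    size = max(1, len(data) // parts)
    return [data[i:i + size] for i in range(0, len(data), size)]

def parallel_sum_of_squares(data, workers=None):
    workers = workers or os.cpu_count() or 1   # match the core count
    chunks = split(data, workers)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Each worker handles one evenly sized chunk in parallel.
        return sum(pool.map(lambda c: sum(x * x for x in c), chunks))
```

The result is identical to the sequential computation; only the division of labor changes.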
Off-loading takes tasks from the CPU, whose speed can no longer be expected to increase, and transfers some of them to an external processor dedicated to a particular type of calculation. For example, transferring tasks to a processor specialized for AI learning enables speeds hundreds of times faster than processing on a single CPU. Off-loading is also used in image processing for autonomous driving, AI processing on smartphones and many other applications. Here too, middleware that appropriately delegates tasks to make optimal use of the dedicated processors plays an important role. To accelerate progress, companies are opening up their middleware to attract a larger pool of application developers.
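The delegation step that this middleware performs can be pictured as a simple dispatcher: inspect each task, and route only the kinds a dedicated processor supports to that device, leaving everything else on the CPU. The sketch below simulates the accelerator in plain Python; the task-class names are hypothetical, and a real system would transfer data to device memory and launch a parallel kernel instead.

```python
# Sketch of the off-loading dispatch pattern; the "accelerator" is simulated.
def run_on_cpu(task, data):
    return [task(x) for x in data]

def run_on_accelerator(task, data):
    # Stand-in for a GPU or AI chip; here it just applies the task.
    return list(map(task, data))

# Hypothetical task classes the dedicated processor supports.
OFFLOADABLE = {"multiply", "convolve"}

def dispatch(kind, task, data):
    """Middleware decision: off-load only task kinds the device handles."""
    if kind in OFFLOADABLE:
        return run_on_accelerator(task, data)
    return run_on_cpu(task, data)
```

Either path returns the same result; the middleware's value lies in choosing the faster executor transparently.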
Architectures have also evolved that combine different pieces of IT infrastructure and offer them as a single service. This trend is particularly visible in database technology for global distribution and integration, and in blockchain technology.
It is revolutionary that database technology enabling global distribution and integration is now offered on cloud platforms. In the past, a service operated from one location but used in countries across the world suffered delayed responses and discrepancies in recorded data because of physical distance. Thanks to the development of robust middleware, global distribution and integration is now possible: each data center delivers a responsive user experience in its own region while all centers operate on the same integrated set of data.
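The core idea, local reads in every region against one integrated data set, can be sketched as follows. This is a deliberately naive illustration with hypothetical names: writes are copied synchronously to every replica, whereas real globally distributed databases use consensus protocols and conflict resolution to achieve the same guarantee.

```python
# Naive sketch of geo-distribution with one integrated data set.
class GeoStore:
    def __init__(self, regions):
        # One replica of the data per regional data center.
        self.replicas = {region: {} for region in regions}

    def write(self, key, value):
        # Propagate every write to all regions so the data stays integrated.
        for replica in self.replicas.values():
            replica[key] = value

    def read(self, region, key):
        # Reads are served by the nearest replica for low latency.
        return self.replicas[region].get(key)
```

A user in any region sees the same value immediately after a write, which is exactly the property the middleware described above provides at global scale.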
World-scale service platform developers used to build such robust IT infrastructure on their own, exclusively to offer their customers social network services and email. Today, any company can migrate its business to the cloud. The promise of open-source development further raises expectations that IT vendors will emerge offering architectures with an even greater degree of flexibility.
Blockchain, the technology underlying Bitcoin, solved a key problem of distributed systems, maintaining a valid shared record without a central authority, and is about to broaden its applications. Because a blockchain is distributed across the world rather than centralized, confirming the validity of a transaction takes a certain period of time, which slows processing. As a result, it is not considered appropriate for small payments and other high-speed transactions. To solve this problem, off-chain technology is under development, in which high-speed transactions are achieved by adding a fast peer-to-peer (P2P) layer on top of the blockchain.
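One common off-chain design is the payment channel: two parties exchange signed balance updates directly over the P2P layer, and only the final state is settled on the blockchain. The sketch below is a simplified model under stated assumptions: a shared HMAC key stands in for the real digital-signature scheme, and dispute/timeout handling is omitted.

```python
# Simplified payment-channel model; HMAC stands in for real signatures.
import hashlib
import hmac

class Channel:
    def __init__(self, key, alice_funds, bob_funds):
        self.key = key  # stand-in for the parties' signing keys
        self.state = {"alice": alice_funds, "bob": bob_funds, "nonce": 0}

    def _sign(self, state):
        msg = repr(sorted(state.items())).encode()
        return hmac.new(self.key, msg, hashlib.sha256).hexdigest()

    def pay(self, frm, to, amount):
        # An instant update exchanged peer-to-peer, never broadcast on-chain.
        assert self.state[frm] >= amount, "insufficient channel balance"
        self.state[frm] -= amount
        self.state[to] += amount
        self.state["nonce"] += 1  # newer states supersede older ones
        return self._sign(self.state)

    def settle(self):
        # Only this final balance would be written to the blockchain.
        return dict(self.state), self._sign(self.state)
```

Thousands of `pay` calls can occur at network speed; the slow on-chain confirmation happens once, at `settle`.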
The development of cross-chain atomic swaps1 is also under way. This technology enables transactions directly between chains, removing the need for a third-party clearinghouse to exchange value between blockchains. However, Bitcoin, with its huge amount of cryptocurrency already in circulation, presents a barrier to entry, and coordinating the interests of the companies and technologies involved may also obstruct the ongoing evolution of this technology.
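The mechanism that makes such swaps "atomic" is typically a hash lock: both chains lock funds against the same hash, so the secret revealed to claim one side automatically lets the counterparty claim the other. The sketch below models only that hash lock; the time-locked refund path that real swaps also require is omitted, and the class names are illustrative.

```python
# Minimal model of the hash lock behind a cross-chain atomic swap.
import hashlib

def make_lock(secret: bytes) -> str:
    """Both chains lock funds against this same hash."""
    return hashlib.sha256(secret).hexdigest()

class HashLockedOutput:
    def __init__(self, lock_hash, amount):
        self.lock_hash = lock_hash
        self.amount = amount
        self.claimed = False

    def claim(self, preimage: bytes) -> bool:
        # Funds release only to whoever presents the matching secret.
        if hashlib.sha256(preimage).hexdigest() == self.lock_hash:
            self.claimed = True
        return self.claimed

# Alice picks a secret; outputs on both chains use the same lock.
secret = b"alice-secret"
lock = make_lock(secret)
chain_a = HashLockedOutput(lock, amount=1)
chain_b = HashLockedOutput(lock, amount=100)
# Claiming on one chain reveals the preimage, which unlocks the other.
chain_b.claim(secret)
chain_a.claim(secret)
```

Neither party can take the other's funds without revealing the secret that frees their own, which is what removes the clearinghouse.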
1 A cross-chain atomic swap is the exchange of one cryptocurrency for another without the need for a third party.
Competition to develop breakthrough IT infrastructure over the next 10 to 20 years has attracted significant capital investment from major global companies. The competition centers on quantum computing, which applies quantum mechanics to achieve massive parallelism. As the technology advances, it is expected to deliver overwhelming computing power, and even today quantum computers outperform existing computers on certain combinatorial optimization problems. In reality, however, today's quantum computer is more like an experimental device in which a conventional computer controls a computation core built from quantum chips. It is uncertain whether quantum computers will outperform conventional computers in all types of computation; a more practical approach may be for a conventional computer to select specific types of processing to off-load to the quantum computer.
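The quantum parallelism mentioned above comes from superposition: a single gate places a qubit in several basis states at once, each carrying part of the probability. A toy, purely classical state-vector simulation of one qubit and one Hadamard gate gives the intuition; real quantum hardware is nothing like this code, which only tracks the amplitudes.

```python
# Toy classical simulation of one qubit under a Hadamard gate.
import math

def hadamard(state):
    """Apply H to the amplitude pair (a, b) for |0> and |1>."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

state = (1.0, 0.0)                     # qubit starts definitely in |0>
state = hadamard(state)                # now an equal superposition
probs = [abs(amp) ** 2 for amp in state]
# Both outcomes carry probability 0.5 simultaneously; applying the
# gate a second time interferes the amplitudes back to |0>.
```

With n qubits the state vector holds 2**n amplitudes, which is why classical simulation breaks down quickly and why quantum hardware is pursued at all.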
If the ability of quantum computers, reported in some experiments to be 100 million times faster than today's machines on particular problems, can be applied to AI, the dominance of quantum computing will be overwhelming. Understanding the evolution of IT infrastructure and its impact on the future should not be overlooked.