Enthusiasts of artificial intelligence should be wary: expectations that this technology will suddenly deliver a surge in productivity recall similar promises made when computers were first introduced into the workplace. Back then, it was said that these strange new machines would sweep away vast amounts of office work and usher the world into a gleaming digital economy. Six decades later, that tone has returned. After the launch of ChatGPT in 2022, researchers at the Massachusetts Institute of Technology claimed that employees using artificial intelligence would become 40% more productive than their peers.

Those claims may not hold up for long, just as the dreamy expectations of the 1960s did not. An in-depth study published by the National Bureau of Economic Research in May found that the time saved thanks to artificial intelligence amounted to no more than about 3%. Other studies have found that relying on it for advanced cognitive tasks can sap employees' motivation and hinder their work.

Just as computers failed to improve productivity at first, we may today be witnessing a new chapter of the "productivity paradox", a term coined to describe the unexpectedly weak productivity growth during the first four decades of the information age. The bright side is that this earlier experience can help us calibrate our expectations today.

Since the 1960s, innovations such as the transistor, integrated circuits, memory chips, and microprocessors have produced a qualitative leap in information technology, with the power of computers regularly doubling every two years at no added cost. In 1964, the Wall Street Journal described computers as "electronic wizards" that took over a "large number of tasks" in offices, working faster than an army of clerical employees and at lower cost. Later that year, the newspaper reported that the spread of computers had led to a "sharp slowdown" in the hiring of clerical workers. The view quickly took hold that computers would pave the way for widespread automation and structural unemployment: a single employee equipped with a computer, it was said, could perform tasks that had required hundreds of workers.

Over the following three decades, the service sector rushed headlong into computerization. But the promised gains never materialized. On the contrary, studies in the late 1980s showed that the service sector, which economist Stephen Roach described as the sector "most endowed with advanced technical capital", was in fact one of the worst performers in productivity. In a famous quip, economist Robert Solow remarked: "You can see the computer age everywhere but in the productivity statistics." Electricity, too, had produced a similar result, and economists offered several explanations for the puzzle that became known as the "Solow paradox". Perhaps the weakest of these explanations, though it is still heard today, is the claim that the whole thing was merely an illusion caused by measurement failure, and that the effects of widespread automation simply did not show up in the economic data. Others, by contrast, blame managers for the failure of information-technology investment to deliver the promised gains.
There is something to this view. Studies of information-technology adoption show that corporate executives spent lavishly on new equipment while hiring highly paid staff to maintain and upgrade those systems. Instead of shrinking the workforce, computers swelled it.

But the most convincing explanation comes from economist Paul A. David, who proposed the "temporary delay" hypothesis. In his view, technological transitions stir up fierce disputes, regulatory battles, and fights over market share, while old systems continue to operate alongside the new. None of this shows up as immediate productivity gains; if anything, the opposite is true.

David found support for the "temporary delay" hypothesis in the period when electricity entered the industrial sector as a faster, more efficient source of power than the energy sources it replaced. Even so, it took about 40 years for electricity to translate into a real increase in worker productivity, amid disputes over industrial standards, waves of mergers, regulatory battles, and the need to redesign every factory. It was a slow, costly, and chaotic process.

The absence of any immediate productivity improvement repeated itself with computers. In 1966, only two years after The Wall Street Journal had celebrated computers, the paper published a report noting that these machines, for all their capacity to store and process data, had begun to drown managers in excessive detail. At the time, the newspaper estimated that roughly 25,000 computers were producing 7,300 miles of paper every day, an early version of what we would now call the corporate information deluge.

Those complaints never went away, yet in the late 1990s the productivity of the US economy finally began a notable improvement. Some economists attributed it to the widespread adoption of information technology, even if the payoff came late. But the momentum did not last long: efficiency faded quickly again, despite the spread of the internet and all the other innovations of that period.

Artificial intelligence will be no exception. The new technology will have unexpected consequences that may limit the hoped-for efficiency gains, or even wipe them out entirely. That does not mean artificial intelligence is useless, or that it will stop companies from adopting it enthusiastically. But anyone hoping for an overnight breakthrough in productivity is setting themselves up for a great disappointment.
Will artificial intelligence increase productivity immediately?
