Written by Nigel Reaney
LMAC Founder and Senior Partner
08 August 2018
Industry 4.0 is the buzz phrase on the street. Leaders across many industries are talking about it. But is it all it’s cracked up to be?
There is no doubt that technologies are developing faster now than at any other time in recent history, but the fact that we are developing new technologies does not mean we are obliged to use them.
For those that are not familiar with the term, Industry 4.0 refers to the onset of what many are calling the 4th Industrial Revolution.
- Historically, the first stage of the Industrial Revolution (1770-1870) was centered on steam, water, and iron, and on a major shift away from agriculture as the main employment of the masses.
- The second stage of the Industrial Revolution (1870-1914) focused on the introduction of new technologies: electricity, oil, and the development of the petrol engine. At this time steel manufacturing became cheaper and was used more widely.
The first two Industrial Revolutions, in general, made people richer and marked a clear shift to the populace becoming more urban.
The 3rd Industrial Revolution refers to the change from analog, mechanical, and electronic technology to digital technology that took place from the late 1950s to the early 1980s. This period also marked the start of a reduced reliance on fossil fuels and a shift to new energy sources such as nuclear and hydro power.
So now we come to the 4th Industrial Revolution. The term "Industry 4.0" was originally coined by a German government high-tech strategy project. The original concepts were further refined and presented by the Industry 4.0 working group in 2013.
At its very core, Industry 4.0 centers on the transfer (or partial transfer) of some level of autonomy and autonomous decision making to cyber-physical systems and machines. In doing this, the theory goes, the self-learning abilities of those systems are leveraged to allow the process in question to continuously improve.
High-tech solutions such as AI (Artificial Intelligence), big data, increased automation, system integration, the Internet of Things (IoT) and many more are all combined to make processes quicker, more reliable, and cheaper than our current human-driven processes.
This all sounds very idealistic, but is this really the way forward?
Historically, all three previous Industrial Revolutions have come at some cost. The 1st and 2nd Industrial Revolutions, for example, saw changes to the way people lived. Many were forced to move from the countryside into urban areas to find work. This, in turn, put pressure on towns and cities that simply did not have the infrastructure to support such a rapid influx. Regulations did not keep pace with change, industrial injuries increased, and pollution spiraled out of control. Disease became rife, and this, coupled with pollution, saw life expectancy in the towns fall.
The 3rd Industrial Revolution also saw shifts in the type of work people were undertaking. Many processes started to become less labour-intensive as digital technology took over routine tasks, and the work that remained changed in character.
Clearly, all three previous Industrial Revolutions have brought tremendous benefits in the long term. But there are lessons to be learned.
We should not blindly jump into Industry 4.0 without thinking it through. Are we simply looking to use new technology because it exists?
There is a human cost, and we need to weigh it against the promised gains.
We are not saying that using new technology is bad, far from it. But there needs to be a balance. Evolution rather than Revolution may just be the way forward.