Intelligent lab digitalization, part 1: a lesson from history
In this first post of a two-part series, we explore why factory electrification was so slow to be adopted during the Second Industrial Revolution, and how those early implementation failures can serve as a warning to companies planning their lab digitalization as part of Industry 4.0.
The Age of Steam
Throughout the Nineteenth Century, steam was the literal powerhouse behind the burgeoning mass production processes that underpinned that era’s dramatic increase in manufacturing output. The typical factory of the period was effectively one vast machine: a single complex structure organized around a giant, coal-fed steam engine that ran non-stop to drive a continuous series of line shafts, belts and gears spanning every room and floor. Everything about these buildings, which packed equipment and workers tightly together, was designed around the massive task of transmitting mechanical energy from a single power source to dozens of specialized machines.
This manufacturing paradigm represented the tried and tested status quo at the turn of the Twentieth Century, and the emergence of electrified systems did little to convince factory owners to alter their approach. By 1882, Thomas Edison’s direct current (DC) generating stations had been unveiled in London and New York, and the new technology was ready to fuel the rapid expansion of communications, planning, urbanization, and globalization that would become the hallmarks of the Second Industrial Revolution. Factory electrification, however, only emerged in the final years of that Revolution and didn’t achieve ubiquity until much later.
The perceived limitations of electrification
Despite conducting more and more business by telegraph (and, later, telephone), and despite having installed electric lighting in their administrative buildings, at the start of the Twentieth Century most manufacturing companies were highly resistant to electrifying their factories. There were, of course, innovators and pioneers who did see the potential of electrification in their manufacturing plants, but they remained a minority: by 1900, only around 5% of US factories were electrified. Significantly, a sizeable portion of these early adopters failed to achieve any notable increase in productivity from making the switch, further shaking confidence in the new technology.
But why did this happen? Why was a fresh wave of industrial progress not being fueled by the magic of electricity? What would need to change before the paradigm fully shifted? And what does any of this have to do with lab digitalization?! The answer to all these questions can be summarized in one word: implementation.
The very first generation of factories had been powered by waterwheels that drove shafts, belts and gears to deliver mechanical energy to the looms and presses operated by workers. Once reliable steam engines became available in the early Nineteenth Century, their far greater power output allowed the same basic concept to be repeated on a much larger scale. Seventy years later, it was the turn of the electric motor to take over as the factory’s prime mover. Companies that had invested heavily in infrastructure optimized for steam immediately saw where electrification could pay off: no more 24/7 coal furnace, no more constant supply of water; a single large electric motor could replace the steam engine along with all the labor and materials needed to maintain it. But this like-for-like substitution, which left the inefficient network of shafts, belts and gears untouched, was precisely the implementation error that prevented the investment in electrification from producing a return.
Playing to strengths
Steam engines are big, and mechanical energy transfer is highly inefficient: every gear, every bearing and every oil-lubricated rotation of a line shaft loses energy to heat and noise. Electric motors, of course, could also be built big, generating the same output and powering the same infrastructure, but the end result would not be significantly different because the method of transmitting power to the machinery remained inefficient. A far less wasteful means of power transfer, however, was already being used to deliver electricity to the factory: copper cable. By instead distributing electricity throughout the site by wire, directly to each room and even to individual workbenches, smaller and more efficient motors could power each piece of equipment independently. This is an example of intelligent implementation: not simply replacing old technology with new, but fundamentally shifting the prevailing mindset and optimizing peripheral processes to better take advantage of the new technology’s strengths.
And this is where laboratory digitalization comes in—but more on that in Part 2!
Adam Lester-George
Adam has two decades of experience working in clinical trials, biomedical research, public health, and health economics, with a particular interest in the intersection between technology and life sciences. For 7 years before joining Bluecrux in 2019, Adam was the director of healthcare innovation consultancy “LeLan” and brings a wide range of insights to his role as Content Specialist for Binocs.