Workflow Automation

As a kid, I was fascinated by a television program called Beyond 2000. Among its myriad episodes, one stands out in my memory with particular fondness: the 1992 installment titled “The Future of Computers”. Within the first minute of this episode, T.J. Rodgers, a luminary figure from Cypress Semiconductor, imparted a statement that has resonated with me for over three decades: “Computers can do lousy jobs people don’t want to do”.

While this assertion may now appear somewhat commonplace, its implications were revolutionary over thirty years ago. Rodgers’ prescient words foreshadowed the emergence of an entire domain within engineering, one characterized by disruption and transformative change. It is within the context of this paradigm that I aim to reflect upon my journey as a systems engineer, expound upon my present accomplishments, and cast a contemplative gaze towards the horizon of possibility.

So what is Workflow Automation?

Workflow automation is the process of automating repetitive tasks, actions, or processes. It involves using technology to streamline and optimize the flow of work, reducing the need for manual intervention and improving efficiency.
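To make that definition concrete, here is a minimal, hypothetical sketch in Python of the kind of lousy job Rodgers had in mind: consolidating daily CSV exports into a single summary file. The folder name, file layout, and output path are illustrative assumptions rather than details from any real system; scheduled with cron or Task Scheduler, a script like this becomes an automated workflow.

```python
import csv
from pathlib import Path

# Hypothetical example: consolidate daily sales CSVs into one summary,
# a chore that would otherwise be done by hand each morning.
INBOX = Path("incoming")       # assumed drop folder for daily exports
SUMMARY = Path("summary.csv")  # assumed output location

def consolidate() -> int:
    """Merge every CSV found in INBOX into a single summary file."""
    rows = []
    for report in sorted(INBOX.glob("*.csv")):
        with report.open(newline="") as f:
            rows.extend(csv.DictReader(f))
    if rows:
        with SUMMARY.open("w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=rows[0].keys())
            writer.writeheader()
            writer.writerows(rows)
    return len(rows)

if __name__ == "__main__":
    print(f"Consolidated {consolidate()} rows into {SUMMARY}")
```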

What are the Benefits?

  • Increased Efficiency: Automation reduces the time and effort required to complete tasks, allowing employees to focus on more strategic activities.
  • Improved Accuracy: Automated workflows minimize the risk of human error, leading to more accurate and consistent results.
  • Faster Turnaround Times: Tasks are completed more quickly, leading to faster turnaround times for projects and processes.
  • Cost Savings: By reducing the need for manual labor, workflow automation can lead to cost savings for organizations.
  • Enhanced Compliance: Automation ensures that processes are executed consistently according to predefined rules and regulations, improving compliance.
  • Enhanced Security: Automation enables real-time monitoring, auditing, and consistent enforcement of security protocols, reducing risk.

The Journey so Far

My initial foray into the realm of Information Technology commenced in a rather unconventional setting: a coal mine, where I worked from 1997 to 2001. At that juncture, the World Wide Web had recently celebrated its fourth anniversary, and substantial strides were being made in the automation of mining machinery. I distinctly recall being captivated by the spectacle of two imposing CRT screens within the wash plant, displaying animated operations of various large machines. This technological marvel facilitated instantaneous alerts and enabled seamless control over machinery operations at the mere click of a mouse, thereby earning the console the epithet “the hot seat” for its pivotal role.

I subsequently progressed to university, where, from 2003 onwards, I pursued electrical engineering. It was during this academic endeavor that I encountered a plethora of open-source monitoring and automation tools such as Zabbix, Nagios, and, later, Observium. Linux emerged as a cornerstone within this domain, revered for its versatility and robustness, and shell scripting gradually assumed a prominent role in my automation endeavors. Throughout my many subsequent years as a field engineer tasked with the installation of hardware, Linux remained a recurrent fixture, underscoring its significance in modern industrial operations.

However, it wasn’t until 2007, upon securing my first full-time post-university role, that I orchestrated the implementation of Linux to streamline reporting processes for company meetings under a prestigious point-of-sale contract with Caltex. This transformative initiative not only yielded substantial time savings but also furnished real-time reporting capabilities, epitomizing the power of automation in enhancing operational efficiency. Subsequent endeavors encompassed a spectrum of automation tasks ranging from stock control to procedure management and change controls, culminating in my ascension to the role of senior engineer entrusted with the leadership of a nationwide team of 30 individuals.
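The original tooling isn’t described here, so the following is purely an illustrative sketch of what such a scheduled reporting job might look like in modern Python; the database file, table, query, addresses, and mail relay are all hypothetical placeholders, not details of the Caltex system.

```python
import smtplib
import sqlite3
from email.message import EmailMessage

# Hypothetical sketch of an automated meeting report: query a sales
# database, format a summary, and email it. Run from cron, e.g.:
#   0 7 * * 1  /usr/bin/python3 weekly_report.py
DB_PATH = "pos.db"                  # assumed SQLite extract of POS data
RECIPIENT = "ops-team@example.com"  # placeholder address

def build_report() -> str:
    """Summarize the last seven days of sales per site."""
    with sqlite3.connect(DB_PATH) as db:
        rows = db.execute(
            "SELECT site, SUM(total) FROM sales "
            "WHERE sold_at >= date('now', '-7 days') GROUP BY site"
        ).fetchall()
    return "\n".join(f"{site}: ${total:,.2f}" for site, total in rows)

def send_report(body: str) -> None:
    msg = EmailMessage()
    msg["Subject"] = "Weekly sales summary"
    msg["From"] = "reports@example.com"
    msg["To"] = RECIPIENT
    msg.set_content(body)
    with smtplib.SMTP("localhost") as smtp:  # assumes a local mail relay
        smtp.send_message(msg)

if __name__ == "__main__":
    send_report(build_report())
```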

Over the course of my professional journey, I have traversed various roles, augmenting my repertoire of automation proficiencies encompassing telephony, email, monitoring, remote control, and infrastructure as code. In recent years, with the advent of dominant cloud platforms such as AWS, my focus has transcended geographical boundaries, pivoting towards global automation initiatives poised to redefine contemporary operational paradigms.

The Here and Now

In my current capacity as a Systems Engineer at Streamvision, I am confronted with a multifaceted and dynamic array of responsibilities. The rapid pace of technological adoption within our organization is nothing short of remarkable, with our leadership team actively fostering an environment conducive to innovation and the continual advancement towards optimal solutions.

On a weekly basis, I meticulously scrutinize workflows to pinpoint areas ripe for automation, thereby unlocking the benefits delineated above. Among my many endeavors, the most recent stands out prominently, representing a pinnacle of professional pride. Presently, I am in the alpha-testing phase of a groundbreaking initiative poised for launch within weeks. Through the judicious application of virtualization technologies, Streamvision has attained unparalleled flexibility in its implementation strategies and testing protocols. We are on the precipice of ushering in a new era of infrastructure deployment, characterized by the global instantiation of infrastructure as code.

This transformative shift has not only expedited the launch of products on a global scale but has also democratized the provisioning process, rendering it accessible to virtually any member of our staff. Consequently, our organization now boasts the capability to provision infrastructure not only within our local sphere but also across our national and international footprint, thereby drastically reducing time-to-market and facilitating seamless scalability.
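The post doesn’t name the tooling involved, so the following is only an illustrative sketch of the underlying idea, expressed with boto3 against AWS: one specification, instantiated identically in every region. The region list, AMI IDs, and instance size are placeholders, not Streamvision’s actual configuration.

```python
import boto3

# Illustrative infrastructure-as-code sketch: a single declarative spec,
# stamped out identically across regions. All values are placeholders.
REGIONS = ["ap-southeast-2", "us-east-1", "eu-west-1"]
SPEC = {
    "InstanceType": "t3.micro",
    "MinCount": 1,
    "MaxCount": 1,
}
# AMI IDs are region-specific; these are hypothetical placeholders.
AMIS = {
    "ap-southeast-2": "ami-00000000000000001",
    "us-east-1": "ami-00000000000000002",
    "eu-west-1": "ami-00000000000000003",
}

def provision_everywhere() -> None:
    """Launch the same instance definition in every target region."""
    for region in REGIONS:
        ec2 = boto3.client("ec2", region_name=region)
        resp = ec2.run_instances(ImageId=AMIS[region], **SPEC)
        instance_id = resp["Instances"][0]["InstanceId"]
        print(f"{region}: launched {instance_id}")

if __name__ == "__main__":
    provision_everywhere()
```

In practice a declarative tool such as CloudFormation, Terraform, or the CDK would hold the specification in version control; it is that reviewable, repeatable definition that makes provisioning safe to hand to virtually any member of staff.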

The Future

It is evident that the trajectory of Streamvision over the forthcoming decade will be profoundly shaped by the convergence of cloud computing, artificial intelligence (AI), and automation technologies. This strategic foresight positions us to transcend traditional paradigms and usher in an era defined by innovative solutions and enhanced customer engagement protocols.

The continual expansion of server, network, and global infrastructure capacity will serve as a catalyst for our evolution, affording us the opportunity to embrace emerging technologies and refine customer-facing interactions. Foreseeably, manufacturers will integrate AI capabilities directly into endpoint devices, heralding a new era of unparalleled automation and personalized customization.

Consider the paradigm-shifting example of Deep Blue, the historic computer system that famously defeated chess grandmaster Garry Kasparov in 1997. Deep Blue itself occupied full server cabinets, yet the computing power required to play at that level has since undergone a remarkable miniaturization: today, chess engines of comparable or greater strength can run on hardware compact enough to reside within the confines of a standard chessboard. This trajectory mirrors the anticipated evolution of technology over the next decade or two, wherein the transformative power of innovation will render contemporary solutions akin to relics of a bygone era.

Chris Musty
Systems Engineer