Computers in the Workplace

Currently, I work in the insurance industry. Insurance existed in some form long before computers were economical enough to sit on every employee's desk. In its early years, the industry ran on pens, pencils, paper, stamps, envelopes, and other manual tools. At my current company, we still sometimes use paper (for billing, notes, and jotting down phone numbers), but we have generally adopted computers for the day-to-day tasks of an insurance agency. Instead of consulting a handwritten spreadsheet to price a new customer's policy, agents now enter the necessary information into our policy management system and let it determine the price automatically.
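To give a rough sense of what that automation replaces, here is a minimal sketch in Python of how a rating calculation might work. The field names, base rate, and surcharge factors are all invented for illustration; this is not how our actual policy management system rates a policy.

    from dataclasses import dataclass

    @dataclass
    class PolicyApplication:
        # Hypothetical fields an agent might enter; a real system uses many more.
        vehicle_year: int
        driver_age: int
        prior_claims: int

    BASE_RATE = 600.00  # invented annual base premium for the example

    def quote_premium(app: PolicyApplication) -> float:
        """Turn agent-entered data into an annual premium (illustrative only)."""
        premium = BASE_RATE
        if app.driver_age < 25:
            premium *= 1.35   # invented surcharge for younger drivers
        if app.vehicle_year < 2010:
            premium *= 1.10   # invented surcharge for older vehicles
        premium *= 1.0 + 0.15 * app.prior_claims   # each prior claim adds 15%
        return round(premium, 2)

    print(quote_premium(PolicyApplication(vehicle_year=2018, driver_age=30, prior_claims=1)))
    # -> 690.0

The point is simply that the lookup an agent once did by hand is now a deterministic calculation the system performs instantly.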

Is Computer Literacy Important?

Entering data correctly is imperative to ensure that our system operates as intended. Despite the safeguards in place, agents have occasionally filled out forms incorrectly, which has caused issues downstream when those forms were processed. Understanding what information is required, knowing what to do when a submission fails, sending courteous emails that follow traditional netiquette, and using Microsoft Teams to organize your calendar all demand some degree of computer literacy. Computer literacy is a requirement for working in insurance; it is no longer optional.
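As a concrete (and entirely hypothetical) example of the kind of safeguard I mean, a system can reject a malformed form before it ever reaches downstream processing. The field names below are invented for the sketch.

    import re

    def validate_submission(form: dict) -> list[str]:
        """Return a list of problems with a form; an empty list means it can proceed."""
        errors = []
        if not form.get("policy_number", "").strip():
            errors.append("policy_number is required")
        if not re.fullmatch(r"\d{5}", form.get("zip_code", "")):
            errors.append("zip_code must be exactly five digits")
        return errors

    problems = validate_submission({"policy_number": "PN-1042", "zip_code": "1234"})
    if problems:
        # Rejecting bad data here is what prevents issues downstream.
        print("Submission rejected:", "; ".join(problems))

Even a check this simple spares the person downstream from chasing a failure back to its source.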

What Does the Future Hold?

With the advent and acceptance of cloud infrastructure, I foresee insurers flocking to cloud offerings as they modernize their systems. Artificial intelligence, meanwhile, is only scratching the surface of what it can do; I imagine we will use AI to help determine new policy costs, flag system anomalies proactively, and even refactor code between languages (I believe IBM and others already offer products that convert COBOL code to Java code to improve portability). If an AI were trained only on your own system plus the base components it needs to operate, it could make processes written decades ago significantly more efficient.

Hardware will continue to improve, as it always has. Although the distance between two transistors is quickly approaching its physical limit, enhancements continue in other forms. Quantum computing looks promising, and I hope to see a resurgence of optimization as an art soon. Back when resources were limited, programmers and developers had to get creative within those constraints. Because hardware has improved so quickly, developers have sometimes decided that "it works" matters more than "it works well"; that is not universally true, but one only has to look at some of the most recent AAA titles that struggle to run on high-end hardware to see that optimization has become a lost art for some.
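To make the anomaly idea above concrete, being proactive does not have to start with anything exotic. The sketch below flags values that drift far from the average using a plain statistical threshold, with invented latency numbers standing in for real system metrics.

    from statistics import mean, stdev

    def flag_anomalies(samples: list[float], threshold: float = 2.0) -> list[int]:
        """Return indexes of samples more than `threshold` standard deviations from the mean."""
        mu, sigma = mean(samples), stdev(samples)
        return [i for i, x in enumerate(samples) if abs(x - mu) > threshold * sigma]

    # Invented response times in milliseconds; the spike at index 5 should be flagged.
    latencies = [102, 98, 105, 101, 99, 480, 103, 97]
    print(flag_anomalies(latencies))  # -> [5]

An AI-based monitor would presumably learn richer baselines than a single mean, but the goal is the same: notice the problem before a customer does.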

Conclusion

Although the future of computing is not written in stone, I hope we see a healthy dose of interest alongside skepticism when implementing new and exciting tools such as AI and cloud infrastructure. I know programmers and some system administrators fear that these tools could fully replace them and make their jobs obsolete, but I believe it would take a decade or two for that to become a reality. In the meantime, researching and learning how best to use these new tools will be critical to ensuring we can keep growing in our computing careers.