At any given point in time, organizations that use technology as an enabler are at a crossroads: developing new systems, upgrading old ones, or supporting systems that are planned to be sunset. When planning these upgrades and new systems, it is prudent to keep an eye on Design for Digitization. This entails understanding how bot technology works and ensuring that future systems are built as loosely coupled yet highly cohesive units.

Think of a situation where validating a doctor's credentials is a semi-automatic process. You look for various licenses, certifications, malpractice history and criminal history, among other things. If your current process does not keep basic provider/doctor details in a digital format but relies on paper copies, you are adding complexity: automating the process with a bot will now also require a step such as OCR to read the paper copies. The point is, bots create a lot of value once you build an ecosystem that allows them to thrive and communicate seamlessly across systems and data sources. Essentially, security, data integrity and scalable infrastructure are the topmost aspects that need to be reimagined if bots are to take over operations and have a real impact on Design for Digitization.
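To make the "loosely coupled yet highly cohesive" idea concrete, the credentialing steps above can be sketched as independent checks that a bot orchestrates once the provider data is digital. The record fields and pass/fail rules below are illustrative assumptions, not a real credentialing system or API:

```python
from dataclasses import dataclass, field

# Hypothetical digital provider record; field names are illustrative only.
@dataclass
class Provider:
    name: str
    license_number: str
    certifications: list = field(default_factory=list)
    malpractice_claims: int = 0
    criminal_record: bool = False

def check_license(p: Provider) -> bool:
    # Placeholder rule: a real check would query the licensing board.
    return bool(p.license_number)

def check_history(p: Provider) -> bool:
    # Placeholder rule: a real check would query malpractice and criminal databases.
    return p.malpractice_claims == 0 and not p.criminal_record

def validate_credentials(p: Provider) -> bool:
    # Each check is a self-contained unit, so a bot can run, retry,
    # or replace any one of them without touching the others.
    return all(check(p) for check in (check_license, check_history))

print(validate_credentials(Provider("Dr. A", "MD-12345", ["ABIM"])))
```

Because each check takes the same digital record and returns a simple result, adding a new verification (say, a certification lookup) means adding one function, not rewriting the workflow. With paper records, every one of these steps would first need an OCR stage in front of it.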

The UI/UX effort should consider the various ways a desktop bot can read data from the screen. Virtualization should be used only when there is no alternative, as most desktop bots have to be trained differently when accessing applications over Citrix or similar environments. Web pages need to be well structured, with each element labelled correctly.
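A minimal sketch of why correct labelling matters: when a form field carries a stable id and label, a bot can target that anchor directly instead of relying on brittle screen coordinates or image matching. The markup below is a made-up example, parsed here with Python's standard-library HTML parser:

```python
from html.parser import HTMLParser

# A well-labelled form: the bot can find the field by its id.
# (The "npi" field name is a hypothetical example.)
LABELLED_FORM = '<form><label for="npi">NPI</label><input id="npi" name="npi"></form>'

class IdFinder(HTMLParser):
    """Collects element ids -- the stable anchors a bot locates fields by."""
    def __init__(self):
        super().__init__()
        self.ids = []

    def handle_starttag(self, tag, attrs):
        for key, value in attrs:
            if key == "id":
                self.ids.append(value)

finder = IdFinder()
finder.feed(LABELLED_FORM)
print(finder.ids)  # the bot can target input#npi reliably across UI redesigns
```

The same page without ids or labels forces the bot to fall back on position or pixel matching, which is exactly what breaks when the UI is refreshed or rendered over a virtualized session.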

That brings us to testing. Testing your systems for bot usage has to be treated as an additional step in your product/system development. Security testing, along with load testing, is critical when planning for quality assurance.

One of the biggest misunderstandings is that bots always operate at lightning speed and are therefore significantly more efficient than their human counterparts. Wrong! Most desktop bot implementations use the same underlying systems that humans use; if those systems are slow, the bot is not going to make them faster. In fact, there will be times when the bot is slower than a human operator, either because of the way the technology works or because speed is traded for better accuracy or better coverage of the exceptions in the process.

Lastly, while thinking about Design for Digitization, carefully review your licensing structures for third-party systems. Most big vendors have recognized the rise of bots and have started charging a separate license for bot access to their systems, in addition to the existing licenses. Some of these vendors can also detect a bot and block it from accessing the system unless a special license or feature is enabled.

In conclusion, the digital workforce, or bots, are here, and they are quickly becoming a great resource for tackling an organization's time, cost and resource constraints. Evaluate your processes before you start a PoC, understand the TCO before and after digitization, and keep an eye on future system designs to ensure you have an ecosystem where automation and bots can thrive.