Many of you may recall the science fiction writer Isaac Asimov’s book, I, Robot. Published in 1950, it imagines the widespread use of robots in society and the moral implications that creates. Needless to say, Mr. Asimov was ahead of his time.
Robotic Process Automation (RPA) is a pervasive technology deployed across many verticals. It saves human workers countless hours of sequential, repetitive tasks that add little value to the business. Wouldn't it be great if you could take advantage of high-value RPA opportunities and replace tasks, not staff – without the risks posed by a rogue "Robby"?
While 'bots' have been around for some time, their use can introduce new threats. Let's face it: a bot is just another account or identity, a 'non-human' machine account with access rights and entitlements. Because of those rights and entitlements, machine accounts need to be governed and managed just like human identities. Those permissions can allow a robot to launch and operate other software within an IT stack, manipulating data across platforms and applications. And if you don't think anything can go wrong with that, then you haven't seen The Terminator lately.
To deliver the proper level of accountability, strong governance of machine accounts is essential. The ownership of bot accounts, and the policies and rules around their roles within business processes, are just as critical as the management of human identities, and maybe even more so, since a bot can work 24/7 and executes tasks far faster than a person. So, knowing what your machine accounts can do, what decisions they can make, and what actions they can take is important. To minimize risk, keep tabs on dormant machine accounts: how long they've been dormant, which bot accounts have been issued but never used, and what roles they are assigned. All of these factors help reduce vulnerabilities.
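As a minimal sketch of the dormancy check described above, the following Python snippet flags machine accounts that were never used or have been idle past a threshold. The account records, names, and 90-day window are hypothetical illustrations, not part of any RSA product API.

```python
from datetime import datetime, timedelta

# Hypothetical machine-account records; last_used of None means "issued but never used".
accounts = [
    {"name": "bot-invoice-01", "role": "AP-Processor", "last_used": datetime(2024, 1, 5)},
    {"name": "bot-report-02", "role": "Reporting", "last_used": None},
    {"name": "bot-sync-03", "role": "Data-Sync", "last_used": datetime(2024, 6, 1)},
]

def flag_dormant(accounts, now, max_idle_days=90):
    """Return the names of accounts that were never used or idle beyond max_idle_days."""
    flagged = []
    for acct in accounts:
        last = acct["last_used"]
        if last is None or (now - last) > timedelta(days=max_idle_days):
            flagged.append(acct["name"])
    return flagged

print(flag_dormant(accounts, now=datetime(2024, 6, 15)))
# → ['bot-invoice-01', 'bot-report-02']
```

In practice a governance platform would pull last-used timestamps from audit logs rather than a hard-coded list, but the review logic is the same: never-used and long-idle bot accounts surface for certification or removal.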
Automating the provisioning and de-provisioning of machine accounts can minimize the risks bot accounts create. Imagine an instance where there's a temporary need for robots to "pitch in" on a short-term project. No problem! Simply time-bind the robot army's access. When the project closes, access to applications and platforms is automatically revoked, and the bot account is either deleted or marked inactive. (This automated provisioning and de-provisioning process should have its own set of related IAM 'rules', so security teams know what can and can't be automatically activated and deactivated.)
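The time-binding idea above can be sketched in a few lines of Python. The `BotAccount` class, account name, and application names here are hypothetical; real deployments would enforce expiry through an identity governance platform, not application code.

```python
from datetime import datetime

class BotAccount:
    """Hypothetical time-bound machine account for a short-term RPA project."""

    def __init__(self, name, granted_apps, expires_at):
        self.name = name
        self.granted_apps = set(granted_apps)  # applications the bot may operate
        self.expires_at = expires_at           # end of the project window
        self.active = True

    def enforce_expiry(self, now):
        """Automatically de-provision once the project window closes."""
        if self.active and now >= self.expires_at:
            self.granted_apps.clear()  # revoke application access
            self.active = False        # mark the account inactive
        return self.active

bot = BotAccount("bot-migration-07", ["ERP", "CRM"], expires_at=datetime(2024, 3, 1))
bot.enforce_expiry(now=datetime(2024, 2, 15))  # within the window: access stays intact
bot.enforce_expiry(now=datetime(2024, 3, 2))   # window closed: access revoked, account inactive
```

Deleting the account outright, rather than marking it inactive, is a policy choice the IAM rules mentioned above should make explicit.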
If your organization is looking to address governance and access management risks of machine accounts in the context of RPA, check out RSA® Identity Governance and Lifecycle. Please contact us to discuss how RSA can help you.