Recently, I was asked to assist one of the back-office administration teams struggling with time-consuming, repetitive tasks. These tasks, which included data entry, data validation, and system integration among others, were essential but prone to human error.
In assessing these processes, I recommended implementing Robotic Process Automation (RPA) as a proof of concept. RPA is a technology that emulates user behavior through scripts, allowing interaction with desktop applications or websites. Its primary advantage is that it frees up personnel for higher-value work while the computer efficiently completes the routine, essential tasks.
To help our back-office administration team overcome their challenge, we opted for Microsoft's Power Automate Desktop (PAD). PAD is an end-to-end automation solution designed for a wide range of tasks.
If you're using Windows 10, PAD can be downloaded for free from the Microsoft Store. However, be aware that your corporate policies might prevent you from installing it.
As a veteran of test automation, I applied a similar process here. These are the steps I followed:
Understand Your Objective and Calculate the ROI: Before embarking on any automation journey, it's crucial to identify why you're doing it and what you hope to gain. Are you trying to free up staff time? Improve accuracy? Increase speed? Having a clear objective will guide your process and help you make key decisions along the way.

Next, assess the potential Return on Investment (ROI). Consider the amount of time and resources that will be spent on automating the process versus the time and cost savings the automation will provide once implemented. For example, if it's going to take two months to automate a process that is used only twice a year, the ROI may not justify the effort and expense. However, if a task is performed daily and requires several hours of manual work, the ROI from automation could be significant. Remember, automation is not always about direct financial returns; it can also provide value in the form of improved accuracy, consistency, and employee satisfaction.
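To make the break-even arithmetic concrete, here is a minimal sketch; every figure is a hypothetical placeholder you would replace with your own time measurements:

```python
# Rough break-even estimate for an automation candidate (all figures hypothetical).

build_hours = 80                  # effort to build and test the automation (~2 weeks)
manual_hours_per_run = 3          # manual effort per execution today
runs_per_month = 20               # roughly daily on working days
maintenance_hours_per_month = 2   # upkeep when screens or file formats change

hours_saved_per_month = manual_hours_per_run * runs_per_month - maintenance_hours_per_month
break_even_months = build_hours / hours_saved_per_month

print(f"Hours saved per month: {hours_saved_per_month}")    # 58
print(f"Break-even after ~{break_even_months:.1f} months")  # ~1.4 months
```

Even a back-of-the-envelope calculation like this is usually enough to decide whether a candidate process is worth automating at all.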
Document the Process: Thorough documentation is a prerequisite for successful automation. The first step involves detailing every aspect of the process you plan to automate, ensuring nothing is left out. Capture all interactions, decision points, and exceptions. This documentation should serve as the blueprint for your automation design.
Recording a video of the process being executed manually can be an excellent tool for this. A video can provide more context than a textual description, illustrating nuances like where and how long the system pauses, how users navigate menus, and the precise sequence of inputs. This visual aid can be particularly helpful when you're dealing with complex systems or processes.
Once you've documented the process thoroughly, review it with the end users and stakeholders involved in the process. They can provide critical insights and validate the documented steps, ensuring that you've captured all the details accurately. This step is crucial because even minor oversights can result in automation failures or inefficiencies.
Lastly, remember that this document should be dynamic: update it as you make changes to the automation or as the underlying manual process evolves. This way, your documentation will always be in sync with the actual automated process.

Account for One-Time Pins and Two-Factor Authentication: These security measures can add a layer of complexity to automation. However, automation accounts can often be configured to bypass two-factor authentication or to use a constant one-time pin (OTP), which simplifies the process significantly. It's essential to balance convenience with security, though, so alternative controls might be necessary to mitigate the added risk. For instance, this could include IP whitelisting, secure network access controls, or additional encryption for sensitive data. Always consult your IT security team to ensure any changes adhere to your organization's security policies and standards.
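If a constant pin isn't an option but your security team allows a shared TOTP secret for the automation account, one alternative is to have the flow compute the one-time pin itself. Here is a minimal sketch using the pyotp library, assuming such an approach has been approved; the environment variable is a placeholder for whatever secure store you actually use:

```python
import os

import pyotp

# The TOTP secret would normally come from a secrets vault; reading it from
# an environment variable here is only a placeholder for the sketch.
secret = os.environ["AUTOMATION_ACCOUNT_TOTP_SECRET"]

# Generate the current one-time pin for the automation account, which the
# flow can then type into the login form instead of waiting on a human.
otp = pyotp.TOTP(secret).now()
print(otp)
```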
Engage with the End User: Communication is crucial during the automation process. Regular catch-ups with the person who will be running the script can surface beneficial suggestions and innovative solutions. In our case, these daily discussions paid off significantly. The business user suggested generating built-in reports to identify the records that needed processing. This insight allowed us to circumvent the complexities of Optical Character Recognition (OCR) on screen objects, simplifying our automation workflow. This real-world example underscores the value of maintaining open lines of communication with your end users; they often have intimate knowledge of the system and can provide valuable insights to improve the automation process. So don't forget to include them in your automation journey.
Identify Repeatable Steps: As you're documenting the process and planning your automation, pay special attention to any steps that are repeated frequently. These could be as simple as logging into a system, navigating menus, or filling out a form. In Power Automate Desktop, you can create subflows for these repetitive tasks.
A subflow is like a function or subroutine in programming: it's a series of steps that you define once and then call multiple times from different parts of your main flow. Not only does this save you the time of recreating these steps every time they're needed, but it also makes your main flow cleaner and easier to understand. Plus, if you ever need to change the way one of these repeated tasks is performed, you can just update the subflow, and the change will be applied everywhere it's used. In our project, one such subflow terminated the target application via a DOS command line, which let us reset the application to a known state without interacting with the UI. This reduces dependence on the UI, which is more prone to changes and inconsistencies, making the automation more reliable and easier to manage.
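For illustration, the logic of that subflow boils down to a single taskkill call; here is a hedged sketch of the equivalent in a script, with a hypothetical executable name (in PAD itself this is simply a command-line action running the same command):

```python
import subprocess

def kill_application(image_name: str) -> None:
    """Force-close a Windows application by its executable name,
    equivalent to running `taskkill /f /im <image_name>` at a command prompt."""
    subprocess.run(
        ["taskkill", "/f", "/im", image_name],
        check=False,          # don't raise if the process isn't running
        capture_output=True,  # keep the console output out of the flow's logs
    )

# Example: reset the target app to a known state before the next record.
kill_application("TargetApp.exe")  # hypothetical executable name
```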
Remember, creating reusable components in your automation workflows, like subflows, is a best practice and significantly contributes to the efficiency and maintainability of your automation projects.

Leverage DOS and AI: When automating tasks, it's often more efficient and reliable to work with the underlying data and systems directly, rather than through their user interfaces. For instance, rather than automating mouse clicks and keystrokes to copy data from one application and paste it into another, you could use DOS commands to extract the data from its source, transform it as needed, and load it into the target system. This approach, known as Extract, Transform, Load (ETL), is less prone to error and more resistant to changes in the user interface. It's important to note, though, that using DOS commands effectively requires a good understanding of command-line tools and scripting.
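Whether you do this with DOS commands, PowerShell, or a small script, the pattern is the same: read the exported source data, reshape it, and write a load file the target system can ingest. A minimal ETL sketch in Python; the file names, column names, and filter rule are all hypothetical:

```python
import csv

# Extract: read the report exported from the source system.
with open("source_export.csv", newline="", encoding="utf-8") as src:
    rows = list(csv.DictReader(src))

# Transform: keep only open records and normalise the amount field.
transformed = [
    {"record_id": r["Id"].strip(), "amount": f"{float(r['Amount']):.2f}"}
    for r in rows
    if r["Status"].strip().lower() == "open"
]

# Load: write a file in the layout the target system's import expects.
with open("target_load.csv", "w", newline="", encoding="utf-8") as dst:
    writer = csv.DictWriter(dst, fieldnames=["record_id", "amount"])
    writer.writeheader()
    writer.writerows(transformed)
```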
Artificial Intelligence (AI) can also be a significant ally in your automation efforts. Various AI models and tools can assist with different aspects of the automation process. For instance, AI can be used for writing regex expressions, identifying useful functions within the tool, and even troubleshooting issues you may encounter. Some AI assistants, like ChatGPT, Bing, Bard, or Poe, are adept at text analysis and generation, while others can handle tasks like image recognition or prediction. By integrating AI into your automation workflows, you can solve more complex problems and create more intelligent, adaptable automation.

Break Down Your Process into Separate Flows: As you move forward with your automation project, consider breaking down your overall process into separate flows, especially if the process is complex or lengthy. Each flow should accomplish a distinct task or stage of the process. This approach helps manage complexity, enhances readability, and makes troubleshooting easier.
For example, in our project, we divided the process into three distinct flows: extraction, transformation, and processing. The first flow handled the extraction of data, the second one transformed that data as necessary, and the third one processed the transformed data.
To share data between these separate flows, we utilized write-to-file functions. Data was written into files by one flow and then read from those files by the next. This way, even if one flow failed or needed to be rerun, it didn't necessarily impact the others. Each flow could be tested and debugged independently, which made the development process more manageable.
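The handoff pattern itself is deliberately simple: the last step of one flow writes its results to a file, and the first step of the next flow reads that file back. A minimal sketch of the idea, with a hypothetical handoff file:

```python
import json
from pathlib import Path

HANDOFF = Path("extracted_records.json")  # hypothetical handoff file

def end_of_extraction_flow(records: list[dict]) -> None:
    # Last step of the extraction flow: persist results for the next flow.
    HANDOFF.write_text(json.dumps(records, indent=2), encoding="utf-8")

def start_of_transformation_flow() -> list[dict]:
    # First step of the transformation flow: pick up where extraction left off.
    # If extraction failed or hasn't run, fail fast rather than process stale data.
    if not HANDOFF.exists():
        raise FileNotFoundError(f"{HANDOFF} not found - rerun the extraction flow first")
    return json.loads(HANDOFF.read_text(encoding="utf-8"))
```

Failing fast when the handoff file is missing is what lets each flow be rerun independently without silently working on outdated data.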
This modular approach to automation is an excellent practice. It promotes reusability, where the same flow could be used in different processes, and maintainability, as it's easier to update or fix a specific flow rather than a large, monolithic one. Keep in mind that organizing your automation into separate flows may require more initial planning and setup, but the payoffs in terms of scalability and manageability are well worth it.
Putting this methodology into practice, each of the three flows was tailored to a specific phase of the process: data extraction, data transformation, and data processing.
The first flow, data extraction, was designed to pull all the necessary data from the original sources. It was configured to access different databases or files, extract the required information, and store it for the next stage. This flow eliminated the manual need to gather data from various sources, thus minimizing the risk of human error and boosting efficiency.
The second flow, data transformation, took over where the first left off. It was responsible for manipulating the extracted data to meet the needs of the final processing stage. Depending on the task at hand, the transformation could involve operations such as data cleansing, format standardization, and the calculation of new data fields. Automating this step ensured that the data was consistently prepared and formatted correctly, saving time and reducing the likelihood of mistakes.
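As a hedged illustration of what such a transformation step can look like (the field names, date format, and tax calculation are made up for the example, not our actual business rules):

```python
from datetime import datetime

def transform_record(raw: dict) -> dict:
    """Cleanse and standardise one extracted record."""
    # Data cleansing: trim stray whitespace from every text field.
    record = {k: v.strip() if isinstance(v, str) else v for k, v in raw.items()}

    # Format standardisation: normalise the date to ISO format (YYYY-MM-DD).
    record["invoice_date"] = (
        datetime.strptime(record["invoice_date"], "%d/%m/%Y").date().isoformat()
    )

    # Calculated field: derive the gross amount from net amount and tax rate.
    record["gross_amount"] = round(
        float(record["net_amount"]) * (1 + float(record["tax_rate"])), 2
    )
    return record

# Example with a hypothetical record:
print(transform_record(
    {"invoice_date": "03/07/2023", "net_amount": "100.00", "tax_rate": "0.15"}
))
# {'invoice_date': '2023-07-03', 'net_amount': '100.00', 'tax_rate': '0.15', 'gross_amount': 115.0}
```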
The third and final flow was tasked with data processing. It used the cleaned and transformed data to perform the necessary operations, such as updating records, generating reports, or performing complex computations. This final step replaced the time-consuming manual effort with a swift, accurate automated process.
The outcome of this methodical approach was significant. We successfully automated tasks equivalent to two man-days per month, freeing up valuable time for the team to focus on more strategic, value-adding activities.
Moreover, this project had an unexpected, but very welcome, ripple effect. The robust and flexible framework we developed served as a catalyst for further automation within the admin team. Seeing the efficiency and accuracy gains from the initial project, they started utilizing the framework to automate other manual processes in their workflow. This additional benefit underscored the transformative power of well-planned and executed automation, boosting productivity and reducing error rates beyond the initial scope of our project.
In conclusion, Robotic Process Automation using tools like Power Automate Desktop can significantly streamline business operations and drive efficiency. Through the implementation of this technology, we managed to automate tasks equivalent to two man-days per month of manual processing.
The exciting part about this journey is that once the initial framework was developed, the admin team began using it to automate other manual processes. This cascading effect of automation empowers teams to create a more efficient work environment.
However, it's important to remember that automation is not a one-size-fits-all solution. The process of implementing automation requires a clear understanding of the tasks at hand, a careful analysis of the return on investment, and a good degree of collaboration with the end users. Also, consider the complexities that may arise, such as dealing with security measures like two-factor authentication, and the need for alternate methods to access and process data.
Breaking down the process into manageable pieces and documenting each step thoroughly can go a long way in ensuring the success of your automation project. Remember to leverage the power of technologies such as DOS and AI to handle complex tasks and make your automation more robust and intelligent.
Automation is an ongoing journey. The initial setup can take some time, but once the foundation is laid, the potential for scale and efficiency is remarkable. It opens the door to automating more complex and higher-value tasks, constantly raising the bar for what can be achieved.
Embrace the journey and remember that every challenge is an opportunity for growth and learning. With each hurdle, you gain knowledge and expertise that you can apply to future automation projects. Happy automating!