Disaster Recovery Journal Spring 2024

Assessing the Risks of AI Dependence in Organizational Resilience

By NATHAN SHOPTAW & JOHN HILL

Organizational resilience professionals are entrusted with safeguarding an organization's ability to endure adversity and disruption. Artificial intelligence (AI) has shown promise in enhancing these capabilities but also raises concerns. It is essential to understand the potential pitfalls of the growing reliance on AI in this context.

Diminished Human Decision-Making

The use of AI in organizational resilience can streamline processes and provide data-driven insights. However, there is a risk of diminished human decision-making. Over-reliance on AI may lead to a loss of control over critical resilience strategies, so professionals should carefully balance automation with human judgment.

Automation Bias

AI systems operate based on algorithms and historical data. In some cases, human decision-makers develop an "automation bias," relying blindly on AI recommendations. This erodes critical thinking and judgment and often leads to complacency. Maintain a robust human oversight system so that any AI-aided decision is subject to review by human experts. Encourage a culture of questioning within the organization: challenge AI recommendations and involve human expertise in decision processes.

Loss of Control

Over-reliance on AI can lead to a loss of control over critical processes. As AI takes on more responsibilities, humans may become detached from the decision-making process, leaving them less capable of handling unexpected scenarios. Define clear boundaries between AI and human responsibilities, and ensure there are scenarios where AI defers to human decision-makers. Ensure human experts
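The deferral guidance above can be sketched as a simple routing rule. This is a minimal illustration, not the authors' method: all names, fields, and the confidence threshold are hypothetical, and a real organization would tune these to its own risk criteria.

```python
from dataclasses import dataclass

@dataclass
class Recommendation:
    """A hypothetical AI recommendation with self-reported confidence."""
    action: str
    confidence: float   # model confidence in [0.0, 1.0]
    high_impact: bool   # does this touch a critical resilience process?

# Illustrative threshold: below this, the AI always defers to a human.
CONFIDENCE_FLOOR = 0.90

def route(rec: Recommendation) -> str:
    """Send high-impact or low-confidence recommendations to human review;
    only routine, high-confidence actions proceed automatically."""
    if rec.high_impact or rec.confidence < CONFIDENCE_FLOOR:
        return "human_review"
    return "auto"
```

Under this sketch, a confident but high-impact recommendation (say, a failover decision) still lands in front of a human, preserving the clear boundary between AI and human responsibilities that the article calls for.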

