Presentation Title: Developing a Preliminary Algorithmic Impact Assessment: AI For Military Simulation Training

Presenter(s): Jolani Rhodenizer, School of Information

Abstract: Impact assessments for AI are a relatively new practice, adopted across industries to ensure accountability, transparency, and recourse for stakeholders who are directly or indirectly affected. The use of AI in military simulation training is of particular ethical concern because of the nature of the intended outcomes of its use. The purpose of this presentation is to lay the foundation for the development of an Algorithmic Impact Assessment for the use of AI in simulation training. The Questionnaire Model will be recommended for creating a governance tool that can effectively and efficiently ensure legality, a tolerable impact on both military users and civilians, and an appropriate level of transparency. Factors such as data availability, classified datasets, and transparency in development may make AI for simulation training a high-risk application. The duty of review-board consultation may necessarily fall to expert review panels, both internal and external, rather than to the public input generally considered a foundational aspect of impact assessments. Ethical concerns around trauma for both users and reviewers will be considered, suggesting that public input may not be useful or appropriate in this context of use.

Link to Recorded Presentation:

2 thoughts on “(2023) Developing a Preliminary Algorithmic Impact Assessment: AI For Military Simulation Training”

  1. Dr. Jessica Bushey

    Well done Jolani! I found your topic very interesting and relevant for so many professions going forward. You’ve given us a lot to think about.

    • Jolani Rhodenizer

      Thank you Dr. Bushey. When it comes to AI, we are looking at ‘if’ in the rearview mirror. My purpose was to engage with the topic cognizant of the need to prepare for ‘when’.
