A New Scale Defining Human Involvement in Technology Communities from an Ethical Standpoint
Keywords: Human-in-the-loop, ethics, artificial intelligence
We undertake an expansive examination of the terms Human-in-the-loop, Human-on-the-loop, Human-out-of-the-loop, and Human-in-command, as used recently in AI development, relative to their ethical implications and implicit assumptions. Tracing the history and development of the ‘Human ...’ terms, we explore the contexts and uses present from their beginnings. We follow with a discussion of the ethical outlook which the origins of the terms, and their recent rebranding for AI development under the notion of oversight, have engendered. Drawing upon certain insights of Bruno Latour for support, we suggest that Latour’s ‘forgotten ethical intermediaries’, folded into our technologies, have their analogue in the view of the human as a component of automated systems alternating with a role of human oversight. We argue that a more ethical human relation to technology can be recovered through an expansive emphasis on human participation in technology-producing communities. Finally, we present a flexible new scale, the IGP scale, for rating such participation.
Alper, Paul. “An Open-Loop Procedure for Process Parameter Estimation Using a Hybrid Computer.” Theory of Self-Adaptive Control Systems: Proceedings of the Second IFAC Symposium on the Theory of Self-Adaptive Control Systems, September 14-17, 1965, National Physical Laboratory, Teddington, England. Edited by P.H. Hammond. Springer Science, NY. 1966. https://link.springer.com/book/10.1007/978-1-4899-6289-8
Anderson, Marc M. Hyperthematics: The Logic of Value. New York. SUNY Press. 2019.
Ayuso, Damaris M. “Topic Session on Discourse.” Proceedings of the 5th conference on Message understanding (MUC5 '93), p. 345, 1993, doi.org/10.3115/1072017.1072051
Batali, John. “How Much AI Does a Cognitive Science Major Need to Know?” SIGART Bulletin, vol. 6, 1995, pp. 16-19, doi.org/10.1145/201977.201985.
Bennet, Corwin A. “Some experimentation on the tie-in of the human operator to the control loop of an airborne navigational digital computer system.” IRE-ACM-AIEE '57 (Eastern), 1957, pp. 68-71, doi.org/10.1145/1457720.1457732.
Birmingham, H.P. and F.V. Taylor. “A Design Philosophy for Man-Machine Control Systems.” Proceedings of the IRE, vol. 42, no. 12, 1954, pp. 1748-1758, doi.org/10.1109/JRPROC.1954.274775.
Brown, Gordon S. and Donald P. Campbell. Principles of Servomechanisms: Dynamics and Synthesis of Closed-Loop Control Systems. John Wiley & Sons, NY, 1948. https://archive.org/details/in.ernet.dli.2015.166797/page/n5/mode/2up?q=d
Caid, W.R. and Webb Simmons. “Minicomputer Assisted Reprogramming System (MARS).” Report for 15 Nov 77 - 15 Nov 79, Defense Nuclear Agency, 1979.
Crawford, Kate. Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence. New Haven. Yale UP. 2021.
Cummings, M.L., et al. “The Impact of Human–Automation Collaboration in Decentralized Multiple Unmanned Vehicle Control.” Proceedings of the IEEE, vol. 100, no. 3, pp. 660-671, 2012, doi.org/10.1109/JPROC.2011.2174104.
Day, Dwayne A. “Invitation to Struggle: The History of Civilian-Military Relations in Space.” Exploring the Unknown: Selected Documents in the History of the U.S. Civil Space Program, vol. 2. External Relationships. Washington DC. 1996.
De Lemos, Rogério. “Human in the loop: what is the point of no return?” Proceedings of the IEEE/ACM 15th International Symposium on Software Engineering for Adaptive and Self-Managing Systems (SEAMS '20), pp. 165–166, 2020, doi.org/10.1145/3387939.3391597.
Docherty, B. et al. Losing Humanity: The Case Against Killer Robots. Human Rights Watch, 2012.
Fanni, R. et al. “Active Human Agency in Artificial Intelligence Mediation.” Proceedings of the 6th EAI International Conference on Smart Objects and Technologies for Social Good, pp. 84–89, 2020, doi.org/10.1145/3411170.3411226.
Gschwind, Robert T. “Control Dynamics of Human Tracking with a Viscously Damped Tracking Aid.” Memorandum Report No. 2709, USA Ballistic Research Laboratories, pp. 1-28, 1976. https://apps.dtic.mil/sti/pdfs/ADA035455.pdf
Hammes, T.X. “Reality in Autonomous Systems: It starts the Loop.” The Cove, 2020. https://cove.army.gov.au/article/reality-autonomous-systems-it-starts-the-loop (Accessed August 2021)
Héder, Mihály. “A criticism of AI ethics guidelines.” Információs Társadalom, vol. 20, no. 4, pp. 57-73, 2020, doi.org/10.22503/inftars.XX.2020.4.5.
High, Peter. “Carnegie Mellon Dean of Computer Science On the Future of AI.” Forbes, 2017. https://www.forbes.com/sites/peterhigh/2017/10/30/carnegie-mellon-dean-of-computer-science-on-the-future-of-ai/ (Accessed September 2021)
HLEG. Ethics Guidelines for Trustworthy AI. European Commission, 2019. https://ec.europa.eu/futurium/en/ai-alliance-consultation.1.html
Hopkinson, William C. and José A. Sepúlveda. “Real time validation of man-in-the-loop simulations.” Proceedings of the 27th conference on Winter simulation (WSC '95), pp. 1250–1256, 1995, doi.org/10.1145/224401.224804.
Latour, Bruno. “La fin des moyens.” Reseaux, vol. 18, no. 100, pp. 39-58, 2000. https://www.persee.fr/doc/reso_0751-7971_2000_num_18_100_2211
--- What is the Style in Matters of Concern? Van Gorcum. Assen, Netherlands. 2008.
Loy, Patrick. “The method won't save you: (but it can help).” SIGSOFT Softw. Eng. Notes, vol. 18, no. 1, pp. 30-34, 1993, doi.org/10.1145/157397.157398.
McLaughlin, Margaret L. et al. “The haptic museum.” Proceedings of the EVA 2000 conference on electronic imaging and the visual arts. 2000. https://infolab.usc.edu/DocsDemos/eva2000.pdf
McRuer, Duane T. and Ezra S. Krendel. “The human operator as a servo system element.” Journal of the Franklin Institute, vol. 267, no. 5, pp. 381-403, 1959, doi.org/10.1016/0016-0032(59)90091-2.
Merat, N., Seppelt, B., Louw, T. et al. “The ‘Out-of-the-Loop’ concept in automated driving: proposed definition, measures and implications.” Cognition, Technology & Work, vol. 21, pp. 87–98, 2019, doi.org/10.1007/s10111-018-0525-8.
Naval Research Laboratory. “Report of Naval Progress.” 1958. https://www.google.fr/books/edition/Report_of_NRL_Progress/z7I1AAAAMAAJ?hl=en&gbpv=1&dq=%22human+in+the+loop%22&pg=RA2-PA20&printsec=frontcover
Porathe, Thomas and Johannes Prison. “Design of human-map system interaction.” CHI '08 Extended Abstracts on Human Factors in Computing Systems. Association for Computing Machinery, pp. 2859–2864, 2008, doi.org/10.1145/1358628.1358774.
Reynolds, Paul F. “Heterogeneous distributed simulation.” Proceedings of the 20th conference on Winter simulation (WSC '88), pp. 206–209, 1988, doi.org/10.1145/318123.318190.
Rissland, Edwina L. and Jody J. Daniels. “A hybrid CBR-IR approach to legal information retrieval.” Proceedings of the 5th international conference on Artificial intelligence and law (ICAIL '95), pp. 52–61, 1995, doi.org/10.1145/222092.222125.
Sanders, Daniel S. “Social Work Concerns Related to Peace and People Oriented Development in the International Context.” The Journal of Sociology & Social Welfare, vol. 15, no. 2, pp. 57-72, 1988.
Shea, J. “Systems Engineering for Manned Space Flight.” 2nd Manned Space Flight Meeting. American Institute of Aeronautics and Astronautics. NY. 1963.
Stieber, Michael E. et al. “Control of Robotic Systems on the Space Station.” IFAC Proceedings Volumes, vol. 31, no. 33, pp. 89-94, 1998, doi.org/10.1016/S1474-6670(17)38392-1.
Stotz, Robert. “Man-machine console facilities for computer-aided design.” Proceedings of the May 21-23, 1963, spring joint computer conference (AFIPS '63 (Spring)), pp. 323–328, 1963, doi.org/10.1145/1461551.1461590.
Stouch, Daniel, et al. “Coevolving collection plans for UAS constellations.” Proceedings of the 13th annual conference on Genetic and evolutionary computation (GECCO '11), pp. 1691-1698, 2011, doi.org/10.1145/2001576.2001804.
Wagner, Markus, “Taking Humans Out of the Loop: Implications for International Humanitarian Law.” Journal of Law Information and Science, vol. 21, 2011. https://ssrn.com/abstract=1874039
Wixon, Dennis and John Whiteside. “Engineering for usability (panel session): lessons from the user derived interface.” SIGCHI Bull., vol. 16, no. 4, pp. 144–147, 1985, doi.org/10.1145/1165385.317484.