AI and Human Relationships at FGCU
August 18, 2023 / Bill Reynolds
(Adapted from remarks made at the Provost's Annual Retreat, August 1, 2023.)
I've been following, and sometimes participating in, the conversation about AI and higher ed over the last several months, and I've noticed two themes that seem to be getting quite a bit of attention. The first is about academic integrity and is encapsulated in the following question: How do we prevent students from using ChatGPT or another AI tool to produce assignments we wish them to complete on their own, and how do we "catch" them when they do use AI inappropriately or surreptitiously? The second theme I'll call "Adapting to the marketplace." This theme is related to the idea that the "genie is out of the bottle" and, therefore, we have to find ways to integrate AI tools into our teaching, or at least help students become familiar with these tools and able to use them in the work they will do in their chosen careers.
To be fully transparent, I'm not especially interested in the first theme, to the extent that it entails helping faculty increase their level of surveillance over students. And, among those in the know, there seems to be a growing consensus that AI detection is a losing battle. Also, while the "adapting to the marketplace" theme is somewhat closer to the work I do in the Lucas Center to support faculty, in that it is directly related to teaching and learning, I don't believe the function of our work as educators is to meet the labor needs of the marketplace. So, I don't find this second theme especially compelling either.
This is not to say, however, that I don't believe AI has the potential to be transformational in higher ed. I just hope it will be transformational in ways that are as yet underrepresented in the discourse on AI and education, so I will turn to a third theme that I hope will occupy our attention in substantive ways in the coming academic year.
What I find to be missing from the conversation about AI and higher ed is recognition of the ways in which AI enhances our opportunities for relationship building with students. In a recent article in the Atlantic Monthly, Adrienne LaFrance (2023) wrote:
Now is the time … to recommit to making deeper connections with other people. Live videochat can collapse time and distance, but such technologies are a poor substitute for face-to-face communication, especially in settings where creative collaboration or learning is paramount. The pandemic made this painfully clear. Relationships cannot and should not be sustained in the digital realm alone, especially as AI further erodes our understanding of what is real. Tapping a "Like" button is not friendship; it's a data point. And a conversation with an artificial intelligence is one-sided, an illusion of connection (para. 17).
In a similar vein, in early 2023 David Brooks wrote in the New York Times:
[AI] is missing a humanistic core. It's missing an individual person's passion, pain, longings and a life of deeply felt personal experiences. It does not spring from a person's imagination, bursts of insight, anxiety and joy that underlie any profound work of human creativity (para. 4).
Finally, in a 2011 New York Times article titled "What is College For?", philosopher Gary Gutting of Notre Dame wrote:
First of all, [colleges] are not simply for the education of students. This is an essential function, but the raison d'être of a college is to nourish a world of intellectual culture; that is, a world of ideas, dedicated to what we can know scientifically, understand humanistically, or express artistically (para. 7).
He went on to write:
Students … need to recognize that their college education is above all a matter of opening themselves up to new dimensions of knowledge and understanding. Teaching is not a matter of (as we too often say) "making a subject (poetry, physics, philosophy) interesting" to students but of students coming to see how such subjects are intrinsically interesting. It is more a matter of students moving beyond their interests than of teachers fitting their subjects to interests that students already have. Good teaching does not make a course's subject more interesting; it gives the students more interests – and so makes them more interesting. … [T]he truth is that, for both students and faculty members, the classroom is precisely where the most important learning occurs (paras. 9-10).
So, if we appreciate the value of the humanistic responses of Brooks and LaFrance to AI, and of Gutting's philosophical perspective on the purpose of higher ed and the transformative potential of the classroom experience, what are the implications for us as teachers, students, administrators, advocates, and staff (essentially anyone involved in the university community) as we attempt to come to terms with the impact of AI on our work? I noted earlier that my hope is that our current preoccupation with AI will lead us to seek deeper, more meaningful relationships with our students, and this is the point on which I will conclude.
In the service of building trust and rapport in the classroom, I believe we can and should approach AI as an object of intellectual inquiry with our students, rather than as a tool to be mastered or an enemy to be subverted. We can engage students in a collaborative process to develop shared guidelines for the use of AI in our courses, while increasing our transparency about the ways in which vital transferable skills (critical thinking, oral and written communication, teamwork, leadership, and professionalism, all of which are identified in FGCU's newest Quality Enhancement Plan) may be inhibited or constrained when we outsource or delegate our intellectual effort to a machine.
I agree with the following point made by LaFrance (2023) in her Atlantic article, and I believe it is fully aligned with our fundamental purpose as educators: "We should trust human ingenuity and creative intuition, and resist overreliance on tools that dull the wisdom of our own aesthetics and intellect … We can and should layer on technological tools that will aid us in this endeavor, but never at the expense of seeing, feeling, and ultimately knowing for ourselves" (para. 24).
In the end, then, I hope that AI will force us to reflect more deeply on what it means to be human, on what we in the university community mean to our students, and on what they mean to us. It is the depth and commitment of our shared goals and connections to one another that cannot be replicated by AI and that are absolutely necessary for the achievement of our educational, personal, and professional aspirations.
References
Brooks, D. (2023, February 2). In the age of A.I., major in being human. The New York Times.
Gutting, G. (2011, January 14). What is college for? The New York Times.
LaFrance, A. (2023, July/August). The coming humanist renaissance. Atlantic Monthly.