A long line of students wrapped around Texas A&M University’s academic plaza in early October to receive free training from Google employees on how to use the company’s artificial intelligence tools, such as its chatbot, Gemini, and its research assistant, NotebookLM.
That same day, about 400 faculty members huddled in a campus building for deeper training from Google on how they could use AI tools to improve teaching and learning in their classrooms and how to effectively and ethically help their students use them as well, said Shonda Gibson, Texas A&M System’s chief transformation officer.
The daylong event was part of Google’s three-year, $1 billion initiative to support AI education and job training programs throughout the U.S. The initiative, which launched in August, supports the tech giant’s AI for Education Accelerator, which gives higher education students and educators free access to tools and training and aims to build a community of institutions sharing best practices.
Texas A&M is one of over 200 higher ed institutions that have signed up for Google’s accelerator, according to Lisa Gevelber, founder of Grow with Google, the company’s workforce development campaign. They include higher ed systems like the University of Texas and University of North Carolina, as well as large institutions like the University of Pennsylvania, University of Michigan and University of Virginia.
“Every student deserves access to the digital tools and the skills and training to set them up for success. And this is our commitment to supporting that,” said Gevelber.
The initiative comes as colleges race to ensure their students are prepared to enter a workforce increasingly shaped by AI.
“It’s not just about using the tools,” said Gibson. “We really want our students to have the best experience possible so that they’re fully prepared whenever they leave us to go on and do whatever they’re going to do in their future.”
Professors who integrate AI into their lessons should follow guidance on how to use it to further student learning, said Alexa Joubin, director of George Washington University’s Digital Humanities Institute.
Without that guidance, students risk using AI as a shortcut by having it summarize information for them instead of actually reading the materials presented and experiencing their lessons, said Joubin.
Meanwhile, recent research suggests AI could be detrimental to students’ skills and outcomes.
A Massachusetts Institute of Technology study released in June found that using AI tools to write essays can erode critical thinking skills and lead to lower cognitive performance.
Over four months, study participants who used AI tools to write essays underperformed at “neural, linguistic, and behavioral levels” compared with those who didn’t, raising concerns about the long-term educational implications of relying on the technology, the study found.
Students are essentially “outsourcing key cognitive tasks to AI,” said Joubin.
The $1 billion initiative
The Texas A&M System joined Google’s initiative, Gibson said, because officials viewed the tech behemoth as the only company offering assistance and guidance at that level.
Gibson also pointed to the free access to normally paid versions of Google’s tools, which will be available over the next two years to students at the system’s 12 institutions.
Google’s tools can act as a personal tutor for students to help them work through problems and learn material in a customized way, said Gevelber.
Gemini, for example, has a guided learning feature that can adapt to a student’s needs, said Gevelber. The tool asks students probing, open-ended questions to spark discussion and dig deeper into a subject, and it introduces images, diagrams, videos and interactive quizzes to help them learn.
And NotebookLM allows students to upload lecture notes or class reading materials so it can create personalized quizzes, study guides, summaries and even AI podcasts that present the information in an engaging way, Gevelber said.
“I’m really blown away by how much Google has offered support for all this work,” said Gibson. “They’ve had a real concentrated, strategic approach to supporting higher education and K-12.”
Texas A&M System students and faculty can also access Google’s Career Certificates, which offer courses, operated through Coursera, on topics such as AI literacy and AI prompting.
The Career Certificates program is available for free to all students at institutions participating in Google’s accelerator. It provides training for entry-level jobs in career fields like data entry, cybersecurity, analytics, information technology support and project management, Gevelber said. Since the program launched in 2018, more than 1 million students have earned certificates, she said.
Using AI with ‘strategic intentionality’
The Google initiative is not the Texas A&M System’s first foray into AI. System leaders have invested in AI training in recent years, in part because they believe the tools help improve grades and lower course withdrawals, said Gibson.
The system teaches professors how to effectively incorporate AI into their courses through an intensive faculty training effort with the Association of College and University Educators that launched five years ago, said Gibson. Seven of the system’s universities also provide students with AI training via certificates on Coursera’s learning platform through a separate initiative, part of a two-year trial that the system is paying for, she said.
The system aims to let students experiment, make mistakes and learn how to use different technologies so they are prepared to enter the workforce, Gibson said. To accomplish that, it needs to train faculty and staff to use generative AI tools so they can be deliberate about how they embed them in courses and academic programs, she said.
“It is our responsibility to teach students to use it ethically and effectively and we have to do that with a lot of strategic intentionality,” said Gibson.
Professors shouldn’t try to avoid AI, but they should be purposeful about how they allow students to use the tools, said Joubin, who is also an English professor at George Washington.
Outsourcing cognitive tasks to AI can be especially problematic in disciplines that require close attention to word choice, such as the humanities. That’s because the words people use to describe the world around them, such as a piece of art, can affect how others perceive it, said Joubin.
Joubin uses a custom, open-source AI program in her classes that draws only from her own lectures, publications and materials, she said. It’s also narrowly tailored: the technology isn’t used to help students write essays but rather to help them think about words or topics more deeply, she said.
If a student asks it a question, for example, the tool will respond with additional questions. She also asks students to rewrite an idea or a sentence in as many different voices as they can, such as a lawyer, a corporate employee or a poet. Then she asks the students to use AI to carry out the same task, giving them even more examples of voices.
“We don’t use AI for any task that requires accuracy, such as information retrieval,” Joubin said, adding that the class instead uses it as a “thinking partner.”
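Joubin’s program itself is not publicly documented, but the general pattern she describes (retrieval restricted to an instructor’s own corpus, plus a prompt that returns questions rather than answers) can be sketched in a few lines of Python. The sketch below is a hypothetical illustration, not her tool; the “lectures” folder, the word-overlap retrieval and the prompt wording are all assumptions.

```python
# Hypothetical sketch of a corpus-restricted "thinking partner":
# it retrieves passages only from an instructor's own materials,
# then wraps them in a prompt that asks questions back instead of
# answering. Joubin's actual tool is not public; the folder name,
# retrieval method and prompt wording are illustrative assumptions.
from pathlib import Path


def load_corpus(folder: str) -> list[str]:
    """Split each lecture or reading file into paragraph-sized passages."""
    passages: list[str] = []
    for path in Path(folder).glob("*.txt"):
        passages += [p.strip() for p in path.read_text().split("\n\n") if p.strip()]
    return passages


def retrieve(question: str, passages: list[str], k: int = 3) -> list[str]:
    """Rank passages by simple word overlap with the student's question."""
    q_words = set(question.lower().split())
    ranked = sorted(
        passages,
        key=lambda p: len(q_words & set(p.lower().split())),
        reverse=True,
    )
    return ranked[:k]


def socratic_prompt(question: str, context: list[str]) -> str:
    """Build a prompt that tells the model to respond with questions,
    drawing only on the retrieved course materials."""
    excerpts = "\n---\n".join(context)
    return (
        "You are a thinking partner for a humanities seminar.\n"
        "Use ONLY the course excerpts below. Do not answer directly;\n"
        "respond with two or three open-ended questions that push the\n"
        "student to examine their word choices and assumptions.\n\n"
        f"Course excerpts:\n{excerpts}\n\n"
        f"Student question: {question}\n"
    )


if __name__ == "__main__":
    question = "What does describing a painting in words change about it?"
    corpus = load_corpus("lectures")  # hypothetical folder of course files
    prompt = socratic_prompt(question, retrieve(question, corpus))
    print(prompt)  # would be sent to whichever model the course runs
```

The key design choice mirrors the one Joubin describes: the model never sees material outside the course corpus, and the prompt forbids direct answers, so accuracy-critical tasks like information retrieval stay off the table.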
In other disciplines, like computer science, students could use AI tools such as GitHub Copilot to assist with coding, said Joubin.
But asking AI to write code from scratch is “an utter waste of time” and no substitute for students building domain expertise themselves, she said. The quality of AI-generated code is often lacking, and it may actually take longer and use more resources to produce, she said.
“There are many ways to code something, to execute a task,” said Joubin. People who assume STEM fields will do well with AI without that kind of critical thinking or reflection “are being deceived,” she said.
Google’s guided learning tools, meanwhile, aim to challenge people’s thinking, offering different points of view and pushing them to consider a particular problem more broadly, said Gevelber.
At Texas A&M, professors will adapt their AI policies to different situations, ranging from banning the tools in their classrooms to allowing students to use them cautiously under guidance to giving students full access after some basic instruction, said Gibson.
“That is going to differ depending on the type of course, the level of the course, what it is that you’re trying to accomplish with your students,” said Gibson. But “Gen AI is something we have to be open and transparent about. How do we want our students to use it?”