
NotebookLM introduced to CWRU community as an educational AI resource

NotebookLM, an artificial intelligence research tool developed by Google, is now available to all Case Western Reserve University students, faculty and staff. NotebookLM is meant to be an educational resource. Rather than drawing its answers from across the open internet, NotebookLM grounds its responses in sources uploaded by the user—PDFs, websites, videos and more. The tool then offers written and audio summaries of the information, comparatively analyzes sources, generates practice quizzes and more.

According to Tron Compton-Engle, senior director of Client Experience for University Technology ([U]Tech), several faculty members requested access to NotebookLM during the fall semester. 

“We didn’t release NotebookLM at that time because Google designated the technology as ‘experimental,’ meaning the company wasn’t committing to the quality of service or its long-term availability,” Compton-Engle said. In December, Google made NotebookLM a production service, published regular terms of service and established a free version of the application. After these changes were made, [U]Tech’s Cloud Governance Committee decided to release NotebookLM to the campus community. 

Compton-Engle also noted the benefits of making technology such as NotebookLM available to all members of the university. “We understand that the students already have access to these tools through personal accounts. However, with NotebookLM, we felt having it available from a CWRU account could have benefits for faculty and staff who want to use it for their work,” he said. 

CWRU does not have a single defined AI policy, but [U]Tech offers several generative AI services and applications to the university community beyond NotebookLM, including Microsoft Copilot and Zoom AI Companion. Jeffrey Capadona, vice provost for innovation at CWRU, said that the university has no one decisive AI policy because faculty are meant to have autonomy over their courses, but that this doesn’t mean conversations about AI shouldn’t happen. “Our strategy remains to provide faculty, staff, and students with opportunities outside the classroom to engage in diverse educational and skills training opportunities to build AI literacy,” he said. “Like any new tool, considerate and ethical use is important to be taught. This is an important reason why AI in education cannot be simply ignored and should be transparently discussed in each course.”

Cognitive science Professor Mark Turner, a member of CWRU’s AI Task Force, is optimistic about the use of NotebookLM in the classroom. Turner is teaching two classes this semester and has introduced his students to NotebookLM as an optional resource. By making the use of NotebookLM optional, he hopes to encourage student exploration of university-moderated generative AI. 

According to Turner, the introduction of AI to the workforce has led to a distinct separation. “We are creating a digital divide of people who are comfortable with AI and people who aren’t,” he said. “That AI divide is going to be really crucial for the future of work.” 

Turner believes this divide is often widened by fear. For that reason, he says it is vital for students to be trained in and to explore the world of generative AI without apprehension or external pressure. “One of the things that I think we need to pay attention to in the professoriate is how to allay fears,” he said. “You’re going to make a ton of mistakes, and it doesn’t matter.”

Third-year computer science student Raaghuv Vazirani has used NotebookLM on occasion but doesn’t think it is as useful as other AI models such as ChatGPT. “It’s not as robust, it can’t answer general questions, and its responses aren’t nearly as good,” he said. “I don’t think it’ll be that crazy of a game changer in the learning environment. The tool has been available for a while now, and it’s genuinely not as good of a ‘chatty’ response model as the other ones that freely exist on the market.” Vazirani acknowledged that NotebookLM has some benefits that other AI models lack. “I think the most promising aspect of NotebookLM is this idea that you can guarantee where the model is pulling information from, to make sure it doesn’t hallucinate, and gives you good information pulled from specific resources,” he said. 

Vazirani believes that AI usage should not be the same across classes and disciplines, but rather that policies must be built on an understanding of the advantages and disadvantages of AI models. “People will use AI no matter what—and if we’re not helping educate people on how best to use it, then that’s a failure of the education system,” he said. “I think it all comes back to how we use them, how much fear-mongering there is, and whether or not people understand how they work.”

However, not all CWRU students are embracing AI tools. One student, who chose to remain anonymous, noted that they would not use AI models such as NotebookLM. “I honestly don’t think these sorts of generative artificial intelligence products have any place in higher education, and strongly dislike that the university continues to try normalizing them,” they said. “I can think of plenty of other things that would have been a better use of whatever contract value is now being spent on this instead.”

On Feb. 25 from 3-4:30 p.m., CWRU’s Veale Institute for Entrepreneurship will be holding a workshop with Google staff members. The workshop will address how founders and students can apply AI tools such as Gemini and NotebookLM in business endeavors. [U]Tech has also developed a Canvas course for faculty and staff members interested in learning more about generative AI. 

Additional reporting contributed by Executive Editor Shivangi Nanda.