
AI Insights

Agencies now have access to no-cost AI platform from GSA

Agencies now have access to a new artificial intelligence tool that has been test-driven by the General Services Administration for the last eight months.

Through the USAi platform, agencies can take advantage of capabilities that include chat-based AI, code generation and document summarization.

David Shive is GSA’s chief information officer.

“We started with 10 users, and then we moved to 100 users, and then 1,000 users, and eventually rolled it out to all employees across GSA,” said David Shive, GSA’s chief information officer, in an interview with Federal News Network. “The business portfolio of GSA is very broad, and they started to apply this thing across those broad business domains. They started to do things like writing code that would satisfy some code development needs that they had completed in hours, not days, weeks or months. They started doing data analytics across multiple data sets, that would normally take days and weeks, and they were doing it in hours.”

Shive said the employees really started to show the impact of the USAi tool during Friday demonstrations last spring led by Stephen Ehikian, then the acting administrator and now deputy administrator.

Shive said 60% or 70% of those demonstrations were employees using USAi to “automate or augment the work that they were doing by reducing the drudgery of their day-to-day work, or making it so they didn’t need to swivel between five systems to do one thing or they’re writing code snippets to automate the drudgery of their work day.”

GSA is offering USAi as a no-cost shared service.

Agencies just have to sign a memorandum of understanding with GSA and determine what databases they want to integrate with the large language models.

Shive said agency customers will have access to USAi through a GSA-hosted, single-tenant cloud environment.

“We don’t need to be in the business of running other agencies’ technology, not because we would be bad at it, but because the business mission of each agency is very different. The risk profile of each agency is very different. It doesn’t make sense to have a one-size-fits-all for everybody,” he said. “We’ve developed a multi-tenant architecture where each tenant, the underlying infrastructure, is all the same, but each tenant is independent. Each agency has full control over the entire stack, including things like the provisioning of their users, assessing that telemetry and those behavior patterns across people and technology, and making risk management framework based decisions on what’s acceptable to them and what’s not.”
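Shive did not detail how those tenants are defined, but the setup he describes, where shared infrastructure is identical while each agency controls its own users, models and risk decisions, is often expressed as per-tenant configuration. The sketch below is a minimal illustration of that idea; all names and fields are hypothetical, not GSA's actual schema.

```python
from dataclasses import dataclass, field

# Hypothetical per-tenant configuration for a multi-tenant AI platform.
# Field names are illustrative only; they do not reflect USAi's actual schema.

@dataclass
class TenantConfig:
    agency: str                     # owning agency for this isolated tenant
    identity_provider: str          # where the agency provisions its own users
    allowed_models: list[str]       # models the agency has approved under its risk framework
    telemetry_enabled: bool = True  # agency-visible usage and behavior telemetry
    data_sources: list[str] = field(default_factory=list)  # datasets the agency chooses to connect

def provision_tenant(config: TenantConfig) -> dict:
    """Stand up an isolated tenant on shared infrastructure (illustrative only)."""
    return {
        "tenant_id": f"usai-{config.agency.lower()}",
        "models": config.allowed_models,
        "idp": config.identity_provider,
        "telemetry": config.telemetry_enabled,
    }

# Example: the agency, not GSA, decides which users, models and data sources to allow.
tenant = provision_tenant(TenantConfig(
    agency="ExampleAgency",
    identity_provider="agency-managed-sso",
    allowed_models=["model-a", "model-b"],
))
print(tenant)
```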

Access to six commercial AI models

Shive said several agencies are already interested in using the platform, but he wasn’t ready to say who or how many.

Through USAi, GSA is offering three services and access to commercial large language models from six providers: Amazon, Microsoft, Google, Meta, OpenAI and Anthropic.

The first service is a basic chatbot.

The second is an API layer that makes it easier for agencies to integrate their datasets into the chatbot. “The connections and the mechanics and stuff are the same for every tenant, but how they configure those is up to each agency to determine,” Shive said.
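GSA has not published the API's specification. As a rough illustration of the chat-style call such an API layer typically exposes, the hedged sketch below shows a client selecting a model and pointing it at an agency dataset; the endpoint, payload shape and identifiers are assumptions, not USAi's documented interface.

```python
import requests

# Hypothetical client call to a shared chat API layer; the URL, payload shape
# and model identifiers are illustrative, not USAi's documented interface.
API_BASE = "https://api.example-tenant.usai.internal"  # placeholder, not a real endpoint

payload = {
    "model": "provider-model-x",        # whichever approved model the agency selects
    "messages": [
        {"role": "system", "content": "Summarize agency documents concisely."},
        {"role": "user", "content": "Summarize the attached acquisition memo."},
    ],
    "dataset": "agency-contracts-2024",  # an agency-connected dataset, per the MOU
}

response = requests.post(f"{API_BASE}/v1/chat", json=payload, timeout=60)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```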

The third service is the console, a management dashboard that lets each agency tailor the tooling and create user requirements based on factors like risk tolerance.

Shive said USAi received its authority to operate at the FISMA moderate level.

“The idea there is that we built something in what we call a model garden. The idea there is that not all models are the same. They all have different strengths and weaknesses, and we want agencies to be able to get access to whatever model they need that they think is suitable for the work that they’re trying to do,” he said. “Also knowing that those strengths and weaknesses are leapfrogging themselves on a very rapid pace, it is a wildly dynamic industry environment out there, this gives agencies the ability to try before they buy at scale. In that rapidly changing, highly dynamic space, they are able to point whatever work they’re trying to do at the highest value model. Value can be defined any number of ways. It’s not just cost. It’s usability, resistance to bias, resistance to cyber attack and other things. We give them unfettered, full visibility into that telemetry, and then they can make the decisions based on whatever their internal criteria is because that is the domain of chief information officers to make those decisions at the agency level. We just give them the tools to be able to do that.”
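Shive's framing amounts to treating model selection as a scoring decision across agency-defined criteria such as usability, bias resistance, security and cost, informed by telemetry. The sketch below illustrates that idea with invented weights and numbers; it is not GSA's methodology.

```python
# Illustrative weighted scoring across a "model garden"; the criteria, weights and
# telemetry values are invented for the example, not GSA data.
CRITERIA_WEIGHTS = {"usability": 0.4, "bias_resistance": 0.3, "security": 0.2, "cost": 0.1}

model_telemetry = {
    "model-a": {"usability": 0.9, "bias_resistance": 0.7, "security": 0.8, "cost": 0.5},
    "model-b": {"usability": 0.7, "bias_resistance": 0.9, "security": 0.9, "cost": 0.8},
}

def score(metrics: dict[str, float]) -> float:
    """Combine per-criterion telemetry into a single agency-defined value score."""
    return sum(CRITERIA_WEIGHTS[c] * metrics[c] for c in CRITERIA_WEIGHTS)

# Each agency would apply its own weights; here "model-b" wins under these made-up numbers.
best = max(model_telemetry, key=lambda m: score(model_telemetry[m]))
print(best, round(score(model_telemetry[best]), 3))
```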

GSA started building USAi about 20 months ago, but its AI journey began more than a decade ago.

The agency took several important steps to get to the point where USAi could emerge.

Oversight process for use cases

Zach Whitman, GSA’s chief data scientist and chief AI officer, said at the recent 930gov conference, sponsored by the Digital Government Institute, that once generative AI emerged as the next iteration of AI capabilities, the agency established a generalized fair use policy for commercial offerings. He said that gave employees the ability to start thinking about how to use GenAI tools as part of their day-to-day workflows.

“One of the major things we understood was folks were afraid. They were afraid of how it would be perceived to use the tool as part of their work. They were afraid of data exfiltration problems. Am I putting the wrong thing in the tool? They were culturally afraid of being perceived as cheating or not necessarily doing the work and having an advantage. And there were a number of different hurdles that we were facing, some technical, some from the use case perspective, should I be doing this?” he said. “It became a clear mandate for us not only technically to figure this out, but also culturally. In the policies that we’ve been receiving from the White House, we’ve had a huge advantage.”

Whitman said GSA set up several gates for the AI tools to jump through before being offered widely across the agency.

GSA created its own evaluation factors based on the mission area as well as its own safety team made up of technical and subject matter experts.

Whitman said an executive team took all this feedback and made risk-based decisions about which tool could be used.

Through this process, Whitman said GSA saw a need for a common AI platform, which became USAi.

“What we did was we ended up building a platform using a lot of open source tooling. Again, nothing very original. It was all off-the-shelf concepts where you would take a chat bot interface and then you would end up building an API layer that interfaced across multiple hyperscalers, which allowed you access to a variety of different models,” he said. “You had a way to measure the behavior of the users on that system. Now the interesting piece of pulling a bunch of models together is it gives you a really good opportunity to compare models based on the use case. We understood that certain models were better than others, certain newer models were better than others, certain larger models were better than others. But that’s not always the case, and so what we ended up doing with the telemetry was less about monitoring the users and more about how we discern and make buying and business decisions based on the different models.”
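Whitman did not describe the telemetry pipeline itself, but the comparison he outlines amounts to aggregating usage signals by use case and model rather than by user. A minimal sketch with made-up records:

```python
from collections import defaultdict
from statistics import mean

# Illustrative aggregation of usage telemetry by (use case, model) to inform
# buying and business decisions; the records and ratings are invented.
records = [
    {"use_case": "summarization", "model": "model-a", "rating": 4},
    {"use_case": "summarization", "model": "model-b", "rating": 5},
    {"use_case": "code_generation", "model": "model-a", "rating": 5},
    {"use_case": "code_generation", "model": "model-b", "rating": 3},
]

by_pair = defaultdict(list)
for r in records:
    by_pair[(r["use_case"], r["model"])].append(r["rating"])

# Average rating per (use case, model) pair, not per user.
for (use_case, model), ratings in sorted(by_pair.items()):
    print(f"{use_case:16s} {model:8s} avg_rating={mean(ratings):.1f}")
```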

Whitman added that GSA’s pilot and data showed the chatbot solves a majority of the use cases employees were interested in using AI for.

He said the chatbots met basic user experience standards and didn’t require customization.

“What we are doing is making sure that we can have interoperability between the different models and maintaining the newest models as quickly as possible, so that the agency has access to these latest models, and then we run our evaluation set against them to understand which model is best suited for which purpose. And we’re doing it in a way where we try to convey to a subject matter expert, not an AI specialist, but a subject matter expert, to make a decision as to which model they need for their specific use case,” Whitman said.
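Whitman did not share GSA’s evaluation suite. The hypothetical harness below illustrates the general pattern he describes: running a fixed evaluation set against each model and reporting suitability per task category; every name, prompt and check here is a stand-in.

```python
# Illustrative evaluation harness; everything here is a stand-in, not GSA's
# actual evaluation set or scoring method.
EVAL_SET = [
    {"category": "summarization", "prompt": "Summarize this memo...", "expected_keyword": "budget"},
    {"category": "drafting", "prompt": "Draft a response letter...", "expected_keyword": "thank"},
]

def fake_model(name: str, prompt: str) -> str:
    """Stand-in for a real model call behind the platform's API layer."""
    return f"[{name}] response mentioning budget and thank you"

def evaluate(model_name: str) -> dict[str, float]:
    """Run the evaluation set against one model and return pass rate per category."""
    results: dict[str, list[bool]] = {}
    for item in EVAL_SET:
        output = fake_model(model_name, item["prompt"])
        results.setdefault(item["category"], []).append(item["expected_keyword"] in output)
    return {cat: sum(passes) / len(passes) for cat, passes in results.items()}

for model in ("model-a", "model-b"):
    print(model, evaluate(model))
```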

AI Insights

Duke University pilot project examining pros and cons of using artificial intelligence in college – Independent Tribune


AI Insights

Experiential learning: A solution to AI-driven challenges

I was halfway into my sustainable agriculture lecture at UC Santa Barbara on an otherwise pleasant February afternoon when I heard the sound no teacher wants to hear: one of my students, in the back row, snoring. Loudly. I decided to plow ahead, even as other students turned around and erupted into giggles. Finally, someone shook the offending student awake, and class proceeded.

Later that week, a teaching assistant approached me to explain how bad the snorer felt about the incident. It wasn’t that the student was uninterested or found my lecture boring, the TA explained; they just struggled to stay awake through such a passive and sedentary experience. It wasn’t the content of my class that was the problem. It was the format.

The longer I’ve taught (this is my 11th year as a professor), the more I’ve leaned on experiential learning: hands-on activities that get students out of their seats and engaging all their senses and capacities. Even as universities in my state are signing deals with tech companies to bring free AI training to campus, I see students clamoring for something else: meaningful in-person experiences where they can make strong connections with mentors and peers.

As I’ve redesigned my classes to integrate more field trips to local farms, volunteer work with community organizations and hands-on lessons focused on building tangible skills, I’ve found that students work harder, learn more, and look forward to class. Instead of just showing slides of compost, I bring my students to our campus farm to harvest castings (nutrient-dense worm poop!) from the worm bins. Instead of just lecturing about how California farmers are adapting to water scarcity, I take students to visit a farm that operates without irrigation, where we help prune and harvest grapes and olives. Long wait lists for these types of classes indicate that demand is far greater than supply.

I’m a proponent of experiential learning in almost every educational context, but there are several reasons why it is particularly relevant and essential this school year.

For one thing, generative AI has upended most traditional assignments. We can no longer assume that writing submitted by students is indicative of what they’ve learned. As many of my colleagues have found out the hard way, students are routinely completely unfamiliar with the content of their own papers. In this environment, there’s a real advantage to directly supervising and assessing students’ learning, rather than relying on proxies that robots can fake.


Second, today’s young adults face an uncertain economy and job market, partly due to AI. Many employers are deploying AI instead of hiring entry-level workers, or simply pausing hiring while waiting for markets to settle. As instructors, we must admit that we aren’t 100% sure which technical skills our students will need to succeed in this rapidly evolving workplace, especially five to 10 years down the road. Experiential learning has the advantage of helping students build the timeless, translatable skills that will AI-proof their employability: teamwork, communication, emotional intelligence and project management. As a bonus, community-engaged learning approaches can introduce students to professional settings in real time, ensuring a more up-to-date and relevant experience than any pre-cooked lesson plan.

Finally, and not unrelated to the above two points, Gen Z is experiencing a mental health crisis that inhibits many students’ ability to focus, set goals and develop self-confidence. There is nothing quite like putting a shovel and some seeds in their hands (preferably out of cellphone range) and watching them build a garden with their peers. The combined effect of being outdoors, digitally detoxing, moving about, bonding with others, and feeling a sense of accomplishment and making a difference is a powerful tonic for rumination and constant online isolation.

The field of environmental studies lends itself to outdoor experiential learning, and this has long been a key component of courses in ecology and earth science. But this approach can be quite powerful across the curriculum. I’ve known political science professors who take students to city council meetings, historians who walk students through the streets of their city to witness legacies of earlier eras, and writing instructors who bring groups of students to wild spaces to develop narrative essays on site.

With support from my department, I’m grateful to be able to teach an entirely experiential field course — but I’m equally excited about integrating modest experiential elements into my 216-person lecture course. Even one experiential assignment (like attending and reflecting on a public event) or hands-on activity in the discussion section can catalyze and deepen learning. 

To be sure, effective experiential learning is an art form that requires significant investment of time and energy from the instructor — and often from community partners as well. This work needs to be appropriately valued and compensated, and off-campus experiences require transportation funding and careful planning to ensure student safety. But the payoff can be the most meaningful and memorable experience of a student’s academic career. Instead of snoozing through a lecture, they can actively develop themselves into the adult they wish to become.

•••

Liz Carlisle is an associate professor of environmental studies at UC Santa Barbara and a Public Voices Fellow of the OpEd Project.

The opinions expressed in this commentary represent those of the author. EdSource welcomes commentaries representing diverse points of view. If you would like to submit a commentary, please review our guidelines and contact us.






AI Insights

Skift hires Lee to cover travel tech and artificial intelligence


Travel industry site Skift has hired Adriana Lee to cover travel technology and artificial intelligence.

Previously, she covered tech and its impact on fashion and beauty at Women’s Wear Daily, and wrote about what everyone from consumers to developers needs to know at multiple tech sites, including ReadWrite and TechnoBuffalo.

Lee was also managing editor at Today’s iPhone.

She is a graduate of George Washington University.




