
Tools & Platforms

Global Tech Organization to Hold AI Summit at UND

(TNS) — The largest technical organization in the world has made the University of North Dakota the location for a summit focused on artificial intelligence and autonomous technology.

Innovation, Workforce and Research Conference summits — like the one coming to Grand Forks — are meant to shine a light on their host communities and the work professionals are doing there, said Melissa Carl, director of business development and career and member services at IEEE-USA.

“It’s a way for us to help put that spotlight on some really cool innovation that’s going on that maybe not the rest of the country would know about,” she said.


IEEE-USA, an organizational unit of the global IEEE (founded as the Institute of Electrical and Electronics Engineers), is a professional organization for engineers, scientists, computer scientists, software engineers, aeronautics engineers and other technical professionals. The organization is holding the IWRC AI and Autonomy Summit at UND on Wednesday, Sept. 10, to bring together industry stakeholders, academia and government leaders for conversations about innovation in technology and science. The event will particularly spotlight the region's AI, aerospace and autonomous systems work. It will be held in the Memorial Union from 8 a.m. to 4 p.m.

The IWRC events are held in small, localized and rural areas, said Leah Laird, senior marketing and communications specialist at IEEE-USA. The summits focus on driving innovation and workforce development in those areas and on bringing together leading stakeholders from academia and government.

“(It gets) conversations and really important discourse going, so that hopefully that can have an impact on the local area when it comes to the U.S.’s stance and innovation, when it comes to technology and science,” she said.

The event coincides with North Dakota Autonomy Week, proclaimed by Gov. Kelly Armstrong. The summit and other events across the state were mentioned in the proclamation, which named North Dakota as a national leader in autonomous systems development and deployment. The proclamation also named UND as a cutting-edge research institution working alongside other industry leaders and public-private partnerships to responsibly develop AI, robotics and autonomy.

“Autonomy Week celebrates the innovators, educators, entrepreneurs, farms and public servants working to ensure that intelligent and autonomous systems benefit all communities across our state,” the proclamation said.

Being a leader in AI is one of the big goals for UND this year. During his yearly State of the University address, President Andrew Armacost said the university aims to become the AI university for North Dakota this academic year.

In past comments, Armacost said UND as a campus is focusing on growth in AI and unification around it. The university is navigating how to prepare students in every discipline to be successful in an AI-enabled world, both in their studies and their careers.

“It’s not the blind adoption of new tools,” he said. “It’s the critical inquiry about what the societal impacts are on AI, what the governance pieces and the legal pieces and frameworks are that we need to understand.”

Armacost will hold a fireside chat during the summit's networking lunch with Phillip Smith, a program manager in the Tactical Technology Office of the Defense Advanced Research Projects Agency. Smith is one of a number of guests at the event, which will draw IEEE-USA members, UND faculty and staff, business leaders and others from the region throughout the day. Mayor Brandon Bochenski will also deliver remarks during the summit.

IEEE-USA will also be holding a conference in October at UND. Carl said it will be more of a traditional IEEE technical conference.

The summit’s full agenda and registration are available at https://iwrc.ieeeusa.org/iwrc-grand-forks/.

© 2025 the Grand Forks Herald (Grand Forks, N.D.). Distributed by Tribune Content Agency, LLC.








Driving the Way to Safer and Smarter Cars

A new, scalable neural processing technology based on co-designed hardware and software IP for customized, heterogeneous SoCs.

Autonomous vehicles have only begun to appear on limited public roads, and it has already become clear that achieving widespread adoption will take longer than early predictions suggested. With Level 3 systems in place, the road ahead leads to full autonomy and Level 5 self-driving. However, it’s going to be a long climb. Much of the technology that got the industry to Level 3 will not scale in all the needed dimensions—performance, memory usage, interconnect, chip area, and power consumption.

This paper looks at the challenges waiting down the road, including increasing AI operations while decreasing power consumption in realizable solutions. It introduces a new, scalable neural processing technology, based on co-designed hardware and software IP for customized, heterogeneous SoCs, that can help solve them.

Read more here.






Tech companies are stealing our books, music and films for AI. It’s brazen theft and must be stopped | Anna Funder and Julia Powles

Today’s large-scale AI systems are founded on what appears to be an extraordinarily brazen criminal enterprise: the wholesale, unauthorised appropriation of every available book, work of art and piece of performance that can be rendered digital.

In the scheme of global harms committed by the tech bros – the undermining of democracies, the decimation of privacy, the open gauntlet to scams and abuse – stealing one Australian author’s life’s work and ruining their livelihood is a peccadillo.

But stealing all Australian books, music, films, plays and art as AI fodder is a monumental crime against all Australians, as readers, listeners, thinkers, innovators, creators and citizens of a sovereign nation.

The tech companies are operating as imperialists, scouring foreign lands whose resources they can plunder. Brazenly. Without consent. Without attribution. Without redress. These resources are the products of our minds and humanity. They are our culture, the archives of our collective imagination.

If we don’t refuse and resist, not just our culture but our democracy will be irrevocably diminished. Australia will lose the wondrous, astonishing, illuminating outputs of human creative toil that delight us by exploring who we are and what we can be. We won’t know ourselves any more. The rule of law will be rendered dust. Colony indeed.

Tech companies have valorised the ethos “move fast and break things”, in this case, the law and all it binds. To “train” AI, they started by “scraping” the internet for publicly available text, a lot of which is rubbish. They quickly realised that to get high-quality writing, thinking and words they would have to steal our books. Books, as everyone knows, are property. They are written, often over years, licensed for production to publishers and the rental returns to authors are called royalties. No one will write them if they can be immediately stolen.

Copyright law rightfully has its critics, but its core protections have enabled the flourishing of book creation and the book business, and the wide (free but not “for free”) transmission of ideas. Australian law says you can quote a limited amount from a book, which must be attributed (otherwise it’s plagiarism). You cannot take a book, copy it entirely and become its distributor. That is illegal. If you did, the author and the publisher would take you to court.

Yet what is categorically disallowed for humans is being seriously discussed as acceptable for the handful of humans behind AI companies and their (not yet profit-making) machines.

To the extent they care, tech companies try to argue the efficiency or necessity of this theft rather than having to negotiate consent, attribution, appropriate treatment and a fee, as copyright and moral rights require. No kidding. If you are setting up a business, in farming or mining or manufacturing or AI, it will indeed be more efficient if you can just steal what you need – land, the buildings someone else constructed, the perfectly imperfect ideas honed and nourished through dedicated labour, the four corners of a book that ate a decade.

Under the banner of progress, innovation and, most recently, productivity, the tech industry’s defence distils to “we stole because we could, but also because we had to”. This is audacious and scandalous, but it is not surprising. What is surprising is the credulity and contortions of Australia’s political class in seriously considering retrospectively legitimising this flagrantly unlawful behaviour.

The Productivity Commission’s proposal for legalising this theft is called “text and data mining” or TDM. Socialised early in the AI debate by a small group of tech lobbyists, the open secret about TDM is that even its proponents considered it an absolute long shot that would not be taken seriously by Australian policymakers.

Devised as a mechanism primarily to support research over large volumes of information, TDM is entirely ill-suited to the context of unlawful appropriation of copyright works for commercial AI development. Especially when it puts at risk the 5.9% of Australia’s workforce in creative industries and, speaking of productivity, the $160bn national contribution they generate. The net effect if adopted would be that the tech companies can continue to take our property without consent or payment, but additionally without the threat of legal action for breaking the law.

Let’s look at just who the Productivity Commission would like to give this huge free-kick to.

Big Tech’s first fortunes were made by stealing our personal information, click by click. Now our emails can be read, our conversations eavesdropped on, our whereabouts and spending patterns tracked, our attention frayed, our dopamine manipulated, our fears magnified, our children harmed, our hopes and dreams plundered and monetised.

The values of the tech titans are not only undemocratic, they are inhumane. Mark Zuckerberg’s empathy atrophied as his algorithm expanded. He has said, “A squirrel dying in front of your house may be more relevant to you right now than people dying in Africa.” He now openly advocates “a culture that celebrates aggression” and for even more “masculine energy” in the workplace. Eric Schmidt, former head of Google, has said, “We don’t need you to type at all. We know where you are. We know where you’ve been. We can more or less know what you’re thinking about.”

The craven, toadying, data-thieving, unaccountable broligarchs we saw lined up on inauguration day in the US have laid claim to our personal information, which they use for profit, for power and for control. They have amply demonstrated that they do not have the flourishing of humans and their democracies at heart.

And now, to make their second tranche of fortunes under the guise of AI, this sector has stolen our work.

Our government should not legalise this outrageous theft. It would be the end of creative writing, journalism, long-form nonfiction and essays, music, screen and theatre writing in Australia. Why would you work if your work can be stolen, degraded, stripped of your association, and made instantly and universally available for free? It will be the end of Australian publishing, a $2bn industry. And it will be the end of us knowing ourselves by knowing our own stories.

Copyright is in the sights of the technology firms because it squarely protects Australian creators and our national engine of cultural production, innovation and enterprise. We should not create tech-specific regulation to give it away to this industry – local or overseas – for free, and for no discernible benefit to the nation.

The rub for the government is that much of the mistreatment of Australian creators involves acts outside Australia. But this is all the more reason to reinforce copyright protection at home. We aren’t satisfied with “what happens overseas stays overseas” in any other context – whether we’re talking about cars or pharmaceuticals or modern slavery. Nor should we be when it comes to copyright.

Over the last quarter-century, tech firms have honed the art of win-win legal exceptionalism. Text and data mining is a win if it becomes law, but it’s a win even if it doesn’t – because the debate itself has very effectively diverted attention, lowered expectations, exhausted creators, drained already meagerly resourced representatives and, above all, delayed copyright enforcement in a case of flagrant abuse.

So what should the government do? It should strategise, not surrender. It should insist that any AI product made available to Australian consumers demonstrate compliance with our copyright and moral rights regime. It should require the deletion of stolen work from AI offerings. And it should demand the negotiation of proper – not token or partial – consent and payment to creators. This is a battle for the mind and soul of our nation – let’s imagine and create a future worth having.

Anna Funder is the author of the prize-winning international bestsellers Stasiland, All That I Am and Wifedom: Mrs Orwell’s Invisible Life. Julia Powles is a law professor and executive director of the Institute for Technology, Law & Policy at the University of California Los Angeles and former contributing editor and policy fellow at The Guardian






AI-related court cases surge in Beijing

A Beijing court has observed an increasing number of cases related to artificial intelligence in recent years, highlighting the need for collaborative efforts to strengthen oversight in the development and application of this advanced technology.

Since the Beijing Internet Court was established in September 2018, it has concluded more than 245,000 lawsuits. “Among them, cases involving AI have been growing rapidly, primarily focusing on issues such as the ownership of copyright for AI-generated works and whether the use of AI-powered products or services constitutes infringement,” Zhao Changxin, vice-president of the court, said on Wednesday.

He told a news conference that as AI empowers more industries, disputes involving this technology are no longer limited to the internet sector but are now widely permeating fields including culture, entertainment, finance, and advertising.

“The fast development of the technology has not only introduced new products and services, but also brought about new legal risks such as AI hallucinations and algorithmic problems,” he said, adding judicial decisions should seek a balance between encouraging technological innovation and upholding social ethics.

In handling AI-related disputes, he emphasized, priority needs to be given to safeguarding individual dignity and rights. For example, the court last year issued a landmark ruling that imitating someone’s voice through AI without permission constitutes an infringement of personal rights.

He suggested that internet users enhance their legal awareness, and urged technology developers to strictly abide by the law to ensure the legality of their data sources and foundation models.

Meanwhile, he said AI service providers should fulfill their information security obligations by promptly taking measures to halt the generation and transmission of illegal content, eliminate it and make necessary corrections.

In addition, he called on judicial bodies to work with other authorities, including those responsible for cyberspace management, market regulation and public security, to tighten supervision of AI applications, drawing clear boundaries of responsibility for technology developers and service providers.




