New Pittsburgh Initiative Aims to Raise Awareness of AI Ethics

PGH.AI's Kenny Chen speaks to the crowd at the group's launch event. Image: PGH.AI

July 18, 2019      
Ray Linsenmayer

PITTSBURGH – A new initiative designed to bring connectivity, accessibility, and transparency to the Pittsburgh-area artificial intelligence ecosystem recently launched. PGH.AI’s goal is to bring “the necessary set of conditions to proactively develop AI solutions that can be scaled and replicated around the globe,” said Kenny Chen, the founder of the initiative.

Developing AI ethically and sustainably will require the collective input of a wide array of organizations and individuals. Luckily, as a relatively small town, Pittsburgh is already a place where there is a good deal of collaboration. Chen said he has lived all over the world, and Pittsburgh is “the most welcoming place I’ve ever been” in terms of collaboration. Mike Capsambelis, a product manager at Google Pittsburgh, agreed: collaboration “is something Pittsburgh is good at and known for.”

PGH.AI is the offshoot of two organizations – the Pittsburgh-based Partnership to Advance Responsible Technology (PART), a think tank working to implement the U.N.’s AI for Good initiative in Pittsburgh, of which Chen is the executive director; and Brussels-based City.AI, which supports the creation of local AI hubs in 70 cities around the world to help motivate people to self-organize and find ways to scale and invest sustainably in AI activities.

Awareness challenges

One of the group’s challenges is that so much of the research and development in AI has been centered around universities. “Where it’s been commercialized, it’s been in corporate R&D labs and startups,” Chen said. Anything that happens “outside a particular industry vertical or academic discipline” is often challenging for people to build awareness around, he said. For people with non-technical backgrounds, including Pittsburgh’s large non-profit community, it can be even more challenging, Chen said.

Kenny Chen, PGH.AI

The group already has an impressive list of partners, including Bosch, which opened its global headquarters for AI research in the Steel City last year; and Google, which recently hired Andrew Moore from Carnegie Mellon University to head Google Cloud AI. Additional partners include Remake Learning, which invests in K-12 education to prepare students for the workforce of the future, and Ascender, a Pittsburgh incubator, where Chen developed the framework for PGH.AI while he was (until recently) Innovation Director.

AI ethics is central to the mission. “The responsible development of AI and adjacent data-driven technologies is a core value” for PGH.AI, asserts Chen. “While that doesn’t mean every single activity or program has to explicitly address ethics … we’re very insistent on maintaining a focus on ethics at each turn and staying vigilant” around things like “bias issues, safety risks, data privacy, and security.”

PGH.AI’s goal, Chen said, is to create an “accessible community that the Pittsburgh region feels a shared sense of ownership around.” Capsambelis added that this community, “which includes a network of AI experts, practitioners, [and] newbies, will have ethics as a front-and-center concern as we share what we’re working on and talk about some of the innovation in the space. I know it’s really important to Google.”

Chen said PGH.AI will have an open door for individuals, companies, or other organizations to propose ideas for projects, and that the group would be happy to work with anyone “as long as their values align with [our] mission.”

Ani Martinez, Remake Learning

Ani Martinez, field director for Remake Learning and a PGH.AI Ambassador, said her nonprofit “considers the rise of AI to be particularly profound because of the deep impacts it has in not only how we measure learning,” but also how it affects “our everyday lives.” She said her group wants to “ensure that children and educators have a seat at the table,” so better decisions can be made about “how those technologies are applied.”

PGH.AI is working with several regional organizations that have developed their own frameworks around ethical AI. These include Alpha Lab Gear and its ethical product development requirement for portfolio companies; various structures at CMU, including the Block Center; K&L Gates’ Center for Ethics in Computational Technologies; and several initiatives at the University of Pittsburgh, including Pitt Cyber.

A main issue, said Martinez, is the “perennial diversity question of how these technologies develop,” and how to ensure “that more diverse data and humans,” specifically in terms of race, physical ability, and gender, “can be a better and more independent part of this process.”

Chen said he is realistic about the impact that PGH.AI can have on the many larger (especially autonomous vehicle) companies already in the region. While direct influence is outside the initiative’s scope, as a convener of discussions PGH.AI can ensure that more decision-makers and the general public are aware of the issues. “My hope is that this will move the dial” with some of them, he said.