Berkeley Engineering

Educating leaders. Creating knowledge. Serving society.


Engineering Ethics workshop

Co-sponsored by the College of Engineering and Enable Tech student group


Thank you to our industry partners

Details & registration

Free lunches for attendees

Prizes for winners of poster contest

Prizes for breakout session winners

It's-It ice cream sandwiches for all participants


Date: April 16, 2026

Time: 10:00 a.m. – 3:00 p.m.

Location: Grimes Engineering Center

Open to: all UC Berkeley faculty, undergraduate students and alumni interested in discussing engineering ethics and social responsibility

Deadline to register for the workshop: April 9, 2026

Register for workshop

Poster contest

Present a poster about how your student organization engages with engineering ethics, or choose one of the provided poster prompts.

Register to submit a poster by: April 2, 2026

Posters must be submitted by: April 13, 2026

Poster guidelines

Register to submit a poster

Breakout sessions with industry partners

Work in groups of 5-6 students and design solutions to case studies provided by our industry panelists.

Prizes will be given to the groups with the most compelling ideas and solutions.

Students, please come with a résumé ready to share electronically with industry partners interested in connecting about jobs and internships.

Agenda

Time: 9:15 – 10:00
Location: Raw Thrills Lounge
Event: Check in, Pastries, Coffee


Time: 10:00 – 10:45
Location: Jarvis Auditorium
Event: Industry Panel with Nvidia & Lam Research

Short presentations by representatives.
How do we enable rapid innovation while thinking about impact on society (which could include sustainability, confidentiality and individual protections)? Possible questions:

  • How are potential societal impacts identified and communicated within the company, and within the industry?
  • Are there examples from your field of differing global norms? Does the company take initiatives to seek global uniformity, or does the variability create unique opportunities?
  • What is the role of regulation, and what do you wish it were? Which regulatory agencies are relevant to your industry, and how do you envision the regulatory space changing? In your field, does regulation help or hinder?
  • What engineering ethics and social responsibility training does your company provide, or expect engineers to have before they are hired?

Q&A with students

Time: 11:00 – 12:30
Location: Brayton, Greene & Rosenblum
Event: Students break into teams (5-6 students) and create a solution to a case study.

Nvidia Case Study

Trusting What You Cannot Fully Test

  • Background: General purpose AI models, the kind that can write code, interpret images, hold conversations, guide robots, assist doctors, and make recommendations, are being deployed faster than we can evaluate them. Unlike a traditional software system where every function can be traced and tested, these models emerge from training on vast amounts of data, producing capabilities that even their creators did not explicitly program and sometimes cannot fully explain. A hospital deploys one to help triage patients. A logistics company integrates one into a fleet of humanoid warehouse robots. A school district uses one to flag at-risk students. In each case, the people deploying the model understand it only partially, and the resources to test every possible real-world scenario simply do not exist.
  • The question: Propose a technical solution, policy framework, or combination of both that would allow an organization to responsibly deploy a general purpose AI model in a high-stakes environment, even when a complete understanding of its capabilities and a comprehensive test of every possible scenario are not achievable.
  • Evaluation Criteria:
    • Risk identification. Does the response correctly identify the specific failure modes and harms that could arise in the chosen deployment context?
    • User behavior profiling. Does the response propose a method for capturing how different types of users interact with the system?
    • Engagement with uncertainty. Does the response propose a mechanism that functions responsibly at the boundary of what can be tested, rather than one that assumes complete coverage is achievable?
    • Specificity. Is the proposed solution concrete enough to be actionable, rather than a restatement of general safety principles?
    • Trade-off transparency. Does the response acknowledge at least one meaningful limitation of its own solution and justify why that cost is acceptable?
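As one entirely hypothetical starting point for the "engagement with uncertainty" criterion, a deployment wrapper might accept the model's answer only above a confidence floor and defer everything else to a human reviewer. The stub model, threshold value, and routing labels below are all invented for illustration, not part of any real deployment:

```python
import random

CONFIDENCE_FLOOR = 0.85  # hypothetical threshold; would be tuned per deployment


def stub_model(case):
    """Stand-in for a general-purpose AI model: returns (label, confidence)."""
    random.seed(case)  # deterministic for this sketch
    return ("urgent" if case % 2 else "routine", random.uniform(0.5, 1.0))


def triage(case):
    """Accept the model's answer only above the confidence floor;
    otherwise defer the case to a human reviewer."""
    label, confidence = stub_model(case)
    if confidence >= CONFIDENCE_FLOOR:
        return {"case": case, "decision": label, "route": "automated"}
    return {"case": case, "decision": None, "route": "human_review"}


results = [triage(c) for c in range(10)]
deferred = [r for r in results if r["route"] == "human_review"]
print(f"{len(deferred)}/10 cases deferred to human review")
```

A sketch like this is only one piece of a full answer; teams would still need to address how the threshold is chosen, how deferred cases are audited, and what happens when the model is confidently wrong.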

Lam Case Study

Designing for Performance, Profit, and Planet

  • You are early-career engineers at Lam Research, a global leader in plasma etch, deposition, and clean equipment used by TSMC, Samsung, Intel, and other leading fabs. Advanced semiconductor fabrication is extraordinarily resource-intensive: a single fab consumes as much electricity as a small city, uses millions of gallons of ultrapure water daily, and relies on fluorinated gases (CF₄, SF₆, NF₃) with global warming potentials 10,000–25,000× that of CO₂.
  • Historically, customers bought equipment based on safety/compliance, on-wafer performance (etch rate, uniformity, selectivity, defectivity), and cost of ownership—CoO (throughput, uptime, consumables). Sustainability was not a purchase factor.
  • That is changing. Customers have made net-zero pledges (2030–2050). Regulations in the EU, Korea, and Taiwan are tightening. Downstream brands demand Scope 3 transparency. Customers now ask suppliers for per-wafer energy reductions, lower gas usage, reduced water/chemical consumption, and lifecycle carbon data. Sustainability is becoming a selection criterion.

Case study full details linked here

Case study supplementals linked here
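To get a feel for the scale involved, teams might start with a back-of-the-envelope CO₂-equivalent calculation. The per-wafer gas quantities below are invented placeholders (not Lam or fab data), and the GWP-100 multipliers are rounded approximations; see the case study supplementals for authoritative values:

```python
# Approximate GWP-100 multipliers (kg CO2e per kg of gas) -- rounded values,
# not authoritative.
GWP = {"CF4": 7_000, "SF6": 25_000, "NF3": 17_000}

# Hypothetical grams of gas emitted per wafer -- invented for illustration.
gas_use_g = {"CF4": 0.8, "SF6": 0.1}

co2e_kg = sum(grams / 1000 * GWP[gas] for gas, grams in gas_use_g.items())
print(f"~{co2e_kg:.1f} kg CO2e per wafer from fluorinated gases alone")
```

Even under one gram of fluorinated gas per wafer yields kilograms of CO₂e, which is why abatement and gas substitution figure so prominently in per-wafer footprints.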


Time: 12:30 – 1:30
Location: Raw Thrills & Giancarlo Gallery
Event: Lunch & Poster Sessions


Time: 1:30 – 2:00
Location: Jarvis Auditorium
Event: Presentation of breakout session and poster session winners


Time: 2:00 – 3:00
Location: Greene & Rosenblum, Brayton
Event: Industry Engagement
Meet with companies to learn about opportunities & employment with Nvidia and Lam Research


Speakers

Dr. Wojciech Osowiecki

Dr. Wojciech (“Wojtek” – pronounced “Voytek”) Osowiecki is a Product Marketing Engineer in the Global Product Group at Lam Research. He is responsible for the environmental sustainability roadmaps of all etch, deposition, and clean products, and the Equipment Intelligence® feature of etch tools. Wojtek founded the Lam Employee Sustainability Community (LESC), an employee resource group focused on environmental sustainability with over 1,100 members, and co-chairs SEMI’s Climate Equity and Social Impact (CESI) working group. A Siebel Scholar and winner of UC Berkeley’s Cleantech to Market program, Wojtek holds a Ph.D. in physical chemistry from UC Berkeley and a joint B.S./M.S. degree in chemistry from Yale.

Barnaby Simkin

Barnaby is Director of Trustworthy AI at NVIDIA. He designs the internal governance structures that guide responsible AI development and deployment. He leads a team that creates processes and evaluation tools to quantify and mitigate risk across models, datasets, and complex integrated systems such as autonomous vehicles and humanoid robots. His work ensures AI is not only technically capable, but safe, compliant, and aligned with legal and ethical expectations.

© 2026 UC Regents