Our two-year goal, by the end of AY 2024-25, is to produce GAI-enabled course structures across the introductory STEM curriculum.
The tools should be agnostic to the textbook(s) used in the course, and should be implementable with a minimal time investment in courses nationwide.
This effort will take place in stages, as follows:
Summer 2023 - Establish a structure for the process, secure resources, build the team, delineate clear goals for the Fall 2023 term, and define a subset of experiments for Fall 2023. Begin initial tool development exploiting the API interface.
Develop an experimental active-learning lecture series that incorporates GPT-4 capabilities and measure its effectiveness for student comprehension and retention.
Fall 2023 - Make all courses GAI-aware. Conduct the initial set of 10 GAI-pedagogy experiments across the STEM curriculum. Track the pedagogical experiments. First draft of experimental-results paper. October 2023 workshop for Harvard STEM faculty.
January 2024 - Three-day national workshop on STEM GAI pedagogy. Development sprint for GAI active learning and HW tools.
Spring 2024 - General Education course offered. Rollout of prototype active learning and HW modules in 4 courses, with assessments.
Summer 2024 - Extension of prototype active learning and HW modules to all of intro STEM curriculum, with assessments.
Fall 2024 - First offering of GAI-empowered courses across the entire introductory STEM curriculum.
Spring 2025
Summer 2025
Fall term 2023
...
Item | Experiments | Courses | Lead | GPT aspect needed | Validation criteria
---|---|---|---|---|---
Lecture components: active-learning methodology that leverages GAI | Incorporation into lecture-format learning | | | |
Develop and exploit short-cycle adaptive problem sets with real-time feedback | | | | |
Assist with analysis of, and gaining insights from, lab data | | | | |
Interactive student self-assessments | | | | |
In-class group consultation with ChatGPT | | | | |
Capturing and submitting work for evaluation by course staff | | | | |
Automated evaluation of understanding of material, by evaluating answers to questions we provide | | | | |
Try out a non-analytic problem and assess the results | | 15 a,b,c | | numerical solution |
Include the ability to perform calculations, as pioneered by Khan Academy | | | | |
Incorporate course-specific training inputs and give them high weighting | | | | custom training inputs |
Automation of grading and assessment of student competence | | | | sequential prompts run open loop, no adjustment |
Dynamic tutoring (see the sketch after this table) | | | | sequential prompts with iterative adjustment |
Generation and refinement of course instructional and assessment materials (HW, exams, etc.) | | | | |
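The "dynamic tutoring" row above distinguishes prompt sequences run open loop from those with iterative adjustment. Below is a minimal Python sketch of the closed-loop pattern, assuming the openai package's chat-completions interface; the function name, prompts, and tutoring heuristic are illustrative, not a settled design.

```python
# Minimal sketch of "sequential prompts with iterative adjustment" (dynamic tutoring).
# Assumes the openai Python package; prompts and function names are hypothetical.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def tutor_session(problem: str, student_attempts: list[str]) -> list[str]:
    """Feed each successive student attempt back into the conversation so that
    the next prompt is adjusted by what the student actually wrote."""
    messages = [
        {"role": "system",
         "content": "You are a patient physics tutor. Give hints, not full solutions."},
        {"role": "user",
         "content": f"Here is the problem the student is working on:\n{problem}"},
    ]
    hints = []
    for attempt in student_attempts:
        messages.append({"role": "user",
                         "content": f"The student's latest attempt:\n{attempt}\n"
                                    "Diagnose the likely misconception and give one guiding hint."})
        reply = client.chat.completions.create(model="gpt-4", messages=messages)
        hint = reply.choices[0].message.content
        hints.append(hint)
        # Keep the hint in the transcript so the next turn is adjusted, not open loop.
        messages.append({"role": "assistant", "content": hint})
    return hints
```

The open-loop variant in the grading row would instead run a fixed sequence of prompts without appending the student's responses to the conversation.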
...
- Assess our assessments: run midterm and final exams of science courses through GPT-4 and grade the results; compare to overall student performance (see the sketch after this list).
- Enhance our assessments: solicit constructive feedback on the exam questions we submit.
- Assess our homework: run homework assignments through GPT-4 and grade the results; compare to overall student performance.
- Enhance our assignments: solicit constructive feedback on the homework we submit.
- Ask (require?) students to use GPT-4 on selected assignments to get feedback and examples of how it can be used.
- Course-specific chatbots: what training data do they need?
- For large lecture classes: merge active learning with GPT.
- For sections: aggregation of questions.
- For labs: try out data-analysis methods and inference.
- Customized assembly of training material: what do we need to start capturing?
- Khan Academy-style adaptive tutorials.
- How does this shift the workload of our non-ladder teaching staff, especially in sections, TF duties, and grading?
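A minimal sketch of the "assess our assessments" item above follows, assuming the openai package and a hypothetical one-question-per-line input file; course staff would then grade the saved GPT-4 responses with the same rubric used for students, so the comparison to overall student performance is like-for-like.

```python
# Minimal sketch for "assess our assessments": run exam questions through GPT-4
# and save the responses for human grading. The file names and one-question-per-line
# layout are assumptions for illustration.
import csv
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def collect_gpt4_answers(questions_path: str = "exam_questions.txt",
                         out_path: str = "gpt4_responses.csv") -> None:
    with open(questions_path) as f:
        questions = [line.strip() for line in f if line.strip()]
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["question", "gpt4_response"])
        for question in questions:
            reply = client.chat.completions.create(
                model="gpt-4",
                messages=[{"role": "user", "content": question}],
            )
            writer.writerow([question, reply.choices[0].message.content])

if __name__ == "__main__":
    collect_gpt4_answers()
```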
Student-centric viewpoint
- Learning how to craft a prompt that gets what you want
- GAI as a consultant
- GAI for self-assessment
- Iterative refinement of GAI interactions
- Learning how to validate and verify results
- Honing critical thinking skills in the GAI context.
- Ethical, responsible, thoughtful use of powerful tools.
- Accommodating disabilities and ensuring equitable access.
IT-centric viewpoint
- How do we integrate these learning tools with existing platforms (e.g., Canvas, grade sheets, SharePoint, Jupyter notebooks, data repositories, assignments, and work-uploading tools)? See the sketch after this list.
- What is the best approach to licensing and token-purchasing?
- How do we throttle and regulate non-course abuse?
- How do we develop, curate, and support the use of this new toolkit?
- What are institutional roles, responsibilities, accountabilities, and authorities?
- What staffing is needed, at what levels of the organization?
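As one concrete integration point for the platform question above, Canvas exposes a REST API. A minimal sketch that pulls a course's assignment list is shown below; the host, course ID, and token variable are placeholders, and pagination and error handling are omitted.

```python
# Minimal sketch of reading a course's assignments from Canvas via its REST API,
# e.g. to feed them into a GAI feedback tool. Host, course ID, and token are placeholders.
import os
import requests

CANVAS_HOST = "https://canvas.harvard.edu"   # assumed Canvas host
COURSE_ID = "12345"                          # hypothetical course ID
TOKEN = os.environ["CANVAS_API_TOKEN"]       # personal access token

def list_assignments():
    url = f"{CANVAS_HOST}/api/v1/courses/{COURSE_ID}/assignments"
    resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
    resp.raise_for_status()
    return [(a["id"], a["name"], a.get("due_at")) for a in resp.json()]

if __name__ == "__main__":
    for assignment_id, name, due_at in list_assignments():
        print(assignment_id, name, due_at)
```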
Specific examples:
Modulated Friction example
...
GAISTEM core team, weekly meetings
C. Stubbs
L. McCarty
G. Kestin
GAISTEM stakeholder interest group, monthly meetings
Department | Stakeholder(s)
---|---
Physics | Matt Schwartz, Louis Deslauriers
Statistics | Xiao-Li Meng, Lucas Janson
EPS | Brandon Meade
MCB | Sean Eddy
OEB | Michael Desai
HEB | ?
SCRB | ?
CCB | ?
Astronomy | Doug Finkbeiner
SEAS | Martin Wattenberg
Math | Cliff Taubes?
College OUE | Amanda Claybaugh, Anne Harrington
Harvard College | Rakesh Khurana
Bok Center | Adam Beaver
HGSE | 
Humanities div. | Robin Kelsey, Jeffrey Schnapps
Social sci. div. | 
VPAL | Bharat Anand
HUIT | Klara Jelinkova
Links
- https://bokcenter.harvard.edu/artificial-intelligence (Bok Center AI page)
...