
AI Summary

→ WHAT IT COVERS
How music composition techniques, AI-generated music tools, and human voice perception shape our emotional responses and daily experiences through deliberate sound choices.

→ KEY QUESTIONS ANSWERED
- How do songwriters use melody and harmony to shape listeners' emotions?
- What role will AI play in future music creation?
- Why do we dislike hearing our own recorded voice?

→ KEY TOPICS DISCUSSED
- Songwriting Craft: Scarlett Keyes explains how composers use octave leaps, key changes, and prosody to create emotional hooks, citing examples from Wicked and Adele songs.
- AI Music Generation: Pierre Barreau demonstrates how tools like AIVA create personalized soundtracks by analyzing classical scores and generating compositions for specific moods and contexts.

→ NOTABLE MOMENT
Rebecca Kleinberger reveals that people actually prefer their own voice when they hear recordings without knowing the voice is theirs, contradicting our typical negative reactions.

💼 SPONSORS
- Recorded Future
- AT&T
- Superhuman (superhuman.com/podcast)
- Carvana (carvana.com)
- US Bank (usbank.com)
- Zoom (zoom.com/podcast)

🏷️ Music Composition, AI Music Generation, Voice Perception, Sound Psychology