Wealth Matters 3.0
THE ATOMIQ LEVEL
EP014 ATOMIQ LEVEL-Threat Casting the Future | AI, Applied Futurism, and Human-Centered Design

Host: Chris J Snook Featuring Cyndi Coon, Applied Futurist

Recorded on Mar 10, 2026

This episode sits at the crossroads of artificial intelligence, future forecasting, and human-centered design, but its real power is in how practical it becomes. The conversation explores what it means to be an applied futurist: someone who does not just predict trends, but combines social science, human behavior, and creative imagination to build prototypes, test scenarios, and shape better outcomes in the present.

At the center of the episode is the powerful concept of threat casting, a method for building a detailed human persona and then stress-testing that person’s world through adversarial scenarios to expose risks, blind spots, and opportunities before they become real.

HNWIs, advisors, business owners, and parents of teenage children need to hear this episode because the stakes are no longer theoretical. AI is changing how people work, relate, decide, protect assets, and prepare the next generation. This conversation offers a smarter way to think ahead. It helps affluent families and advisors anticipate emerging risks, gives business owners a framework for resilience and strategic design, and gives parents a lens for understanding the world their children are growing into. It is not fear-driven futurism.

It is insight-packed preparation for people who want to stay human, adaptive, and ahead of the curve.

Connect with Cyndi Coon

Find all her sites via Linktree https://linktr.ee/cyndicoon

Watch the Video Interview (45min sample only)

Photo credit: Advanced AI Society-Human Authorized AI Summit on Human Agency in Napa, CA

EPISODE 14 Summary Notes

Applied futurist Cyndi Coon detailed her human-centric work with generative AI, discussing ASU’s innovative culture and the threat casting methodology.

Applied Futurist Defined
Cyndi Coon described her applied futurist role, which combines social science and creative practice to produce functional prototypes, acting as a prototyper for live worldbuilders. Threat casting, a sister methodology to future casting, involves building effects-based models to examine how something affects a human being.

Innovation and Autonomy at ASU
Cyndi Coon explained that Arizona State University’s consistent innovation is driven by the almost absolute autonomy granted to staff and the mandatory requirement to break silos in performance reviews. This culture fosters cross-pollination across disciplines, as exemplified by the ‘Hacks for Humanity’ event.

Human Agency and Future Assets
The discussion centered on human agency, noting that the rise of AI presents an opportunity to redefine being human and focus on human connection. The future of income will involve data sovereignty, where individuals own and potentially monetize their personal data, establishing Name, Image, and Likeness as a sovereign asset.

Details

  • Introduction of Applied Futurist Cyndi Coon: Chris J Snook welcomed Cyndi Coon to the show, noting that they had recently attended the Advanced AI and Linux Foundation’s Summit on Human Agency in Napa Valley. The discussion focused on Cyndi Coon’s background as an applied futurist, her current work, and her previous experiences (00:00:00). Cyndi Coon noted her focus on the human-centric piece of artificial intelligence, particularly generative AI, since about 2017 (00:01:09).

  • The Threat Casting Lab and Human Focus: Cyndi Coon stated that her interest in AI is driven by human elements, directing her practice toward how technology affects people. She started looking at generative AI in 2017 at the threat casting lab, which she co-created. Chris J Snook asked for context on the lab, its client base, and its evolution from prior work (00:01:09).

  • Cyndi Coon’s Background and Career Path: Cyndi Coon’s background started in Western Michigan, where they were trained as a creative in art school. They earned a master’s degree at Arizona State University (ASU) in social practice, which involves building creative structures around social issues to elevate them into the cultural zeitgeist (00:02:27). This led to a decade of running a paper goods manufacturing company, followed by a pivot into producing large-scale human events and serving as an experience producer for ASU President Michael Crow for about 14 to 15 years (00:03:45).

  • Transition to Futures Work and Institution Building: Cyndi Coon’s company, Laboratory 5, became the entity leading projects, where they were hired as a “producer for hire” to realize concepts, not just events. This futures work began with an opportunity through ASU’s Future Arts Research, looking at how art affects justice, architecture, and housing (00:03:45). Due to their creative, scaling, and business background, they assisted brilliant humans, such as scientists and poets, who received large sums of grant money and needed help standing up and running institutes and centers (00:05:35).

  • Launch of the School for the Future of Innovation and Society: Cyndi Coon launched the School for the Future of Innovation and Society for ASU in Washington, DC, with partner Brian David Johnson. Prior to this, they worked with Johnson on an event called ‘Emerge’ at ASU, where artists, scientists, and engineers reimagined the future, working on projects built out to the year 2045 (00:06:51). Johnson, who developed the threat casting methodology internally at the Intel Corporation, decided to take the method out of the enterprise for use by the federal government and US military (00:08:29).

  • Defining the Applied Futurist Role and Threat Casting Methodology: Cyndi Coon detailed their work as an applied futurist, which involves social science (research, writing, analysis, and data) combined with creative practice (imagination) to gaze into the future. The “applied” piece means they take data and turn it into functional prototypes, sketches, mockups, or script writing, acting as the prototypers for live worldbuilders (00:10:13). Threat casting involves building effects-based models to examine how something affects a human being (00:11:36).

  • The Threat Casting Exercise Process: The threat casting process involves creating a detailed persona, a specific person that build teams become invested in, down to their dog, family, and job (00:11:36). The next step is to create an adversary and make a severe disaster happen, which acts as an empathy hack to elicit better data. The team then backcasts from a future time to the present to figure out how to disrupt, mitigate, or recover from the disaster (00:12:53).

  • Future Casting and Government Work: Future casting is the sister methodology to threat casting, focusing on how to enable opportunity on purpose by creating and planning to drive toward, enable, and sustain a big opportunity. In 2017, the threat casting lab was stood up at ASU and immediately began doing government contracts through ASU and its applied research enterprise, ASURE, for classified work (00:14:00). Their primary government partner is the Army Cyber Institute at West Point, which has run many exercises and produced reports (00:15:17).

  • ASU’s Innovative Culture and Structure: Chris J Snook noted ASU’s consistent recognition as a highly innovative university and questioned what drives this reputation (00:15:17). Cyndi Coon explained that ASU is composed of several entities: the education division, ASURE (classified work), an office for helping faculty with patents and startup businesses, the foundation, and a global international entity (00:17:31). A sixth entity is the real estate holding company, which facilitated the growth of ASU’s footprint across Phoenix, Tempe, Scottsdale, and Mesa (00:18:43).

  • The Importance of Autonomy and Silo Breaking at ASU: Cyndi Coon stated that ASU wins at innovation because of the almost absolute autonomy granted to staff, allowing them to do almost anything if they have the money, resources, and staffing. They stressed that the innovative culture is in the DNA and is not just a marketing tactic (00:21:30). Furthermore, people are penalized in their reviews if they do not break silos, requiring them to reach outside of their disciplines and cross-pollinate with others (00:22:28).

  • ASU’s Hacks for Humanity as a Microcosm of Innovation: Chris J Snook provided a small example of ASU’s autonomy by describing their time as an Entrepreneur in Residence at SkySong, the Scottsdale campus (00:24:57). They described the “Hacks for Humanity” event, which intentionally required teams to have at least three different disciplines to compete, leading to a mix of diverse individuals from art, business, and computer science (00:25:45). This requirement was purpose-built to cross-pollinate disciplines and solve problems (00:27:41).

  • Threat Casting on Higher Education and AI Readiness: Chris J Snook presented a hypothetical threat casting scenario targeting “higher education itself,” seeking to destroy the current model due to the availability of AI (00:27:41). Cyndi Coon could not speak to ASU’s current readiness for this scenario, as they moved in January 2022 to New Mexico to stand up the applied futures lab and are no longer in the loop regarding the president’s office (00:28:30). Cyndi Coon noted that their friend Andrew Maynard at ASU was one of the first to immediately stand up a class for students to understand generative AI (00:29:49).

  • The Future of Higher Education and Vocational Focus: Cyndi Coon believes that AI could serve as an educator to help with basic, monotonous work like grading or reviewing, supporting faculty and staff through AI plus automation (00:30:45). Looking ahead, they predict a massive transition in the next decade involving an appropriate return to vocational technical education, which is currently needed to fill workforce shortages in professions like plumbing, welding, and electrician work (00:31:58). Chris J Snook emphasized that these vocational jobs have a longer threat horizon than knowledge work due to the constraints on scaling humanoid robots (00:33:17).

  • Optimism as a Futurist and Human Resilience: Cyndi Coon described themself as a “ridiculous optimist” because their work involves spending time on terrible threats but also identifying how to prevent them (00:36:34). They firmly believe in human capacity, spirit, and resiliency, acknowledging that while humans cause all the problems, they are also the solution (00:39:03). Cyndi Coon’s optimistic viewpoint comes from looking past the “fires burning” and positioning themself on the other side where they are stable, healthy, and ready to build when the crisis passes (00:40:18).

  • Revisiting Survival and the Reframing of Humanity: Chris J Snook discussed the interesting analogy of nature being “at war” and how humans, who have not had to focus on survival for the last 100 years, are now having their reality reframed by machines (00:43:16). Cyndi Coon observed that this moment creates an opportunity to discover humanity in new ways (00:45:09). They noted that convenience and ease of life have led to “rudderless humans” who lack the hunger or drive for life, emphasizing that the necessity for real authenticity is where the future lies (00:46:13).

  • The Shifting Value Proposition of Money and Work: Cyndi Coon agreed that money is not everything unless one does not have any (00:48:35). They pointed out that many Americans were able to level up financially, but machines are set to radically alter intellectual white-collar spaces, displacing massive numbers of people (00:49:30). Chris J Snook suggested that the new, non-human species (AI) is built for complexity and can thrive in it, unlike humans, which presents an opportunity for “an exercise in humility” and liberation to redefine what being human means (00:51:46).

  • AI as an “Appetizer” and Focus on Human Agency: Cyndi Coon clarified that the machines are not sentient and that Artificial General Intelligence (AGI), while still a possibility, is not yet here (00:53:43). They see AI as an “appetizer” before future advancements like quantum technology. They urged people to talk less about the machines and stay focused on the fact that humans are building, powering, and providing the bias for these models and robots (00:54:52). Chris J Snook suggested that these forces may balance out totalitarian incentives and structures (00:55:56).

  • The Disruption of Old Incentive Models: Chris J Snook recognized that existing incentive models built over the last 40 years, such as the reliance on the fiat money system, are not going to change overnight. However, they argued that the beauty of disruptive technologies is that they enable people to build new systems, drawing the best and brightest talent away from the old guard (00:57:13). This shift in energy toward human agency and new rails like crypto and Bitcoin is key to enabling good things (00:58:12).

  • The Exhaustion with the Screen and Data Sovereignty: Cyndi Coon, an advisor to women in web3, is observing an exhaustion with screens and a desire for analog, in-real-life relationships, coupled with an awakening among people that the data taken from them belongs to them. Cyndi Coon believes that this shift is leading toward data sovereignty, where individuals will own their personal data and can monetize it if they choose, which they view as the future of income (00:59:04). Chris J Snook introduced a conceptual conflict in their mind among the pragmatist, realist, and optimist views on the future (01:00:27).

  • Societal Governors and the MAG 7 Inertia: Chris J Snook discussed natural governors, such as fires, floods, and earthquakes in nature, and analogous events in business and society like market crashes or social cultural movements (01:00:27). They noted the current inertia of the MAG 7 companies, characterized by ample dry powder and capital looking for returns higher than 4%, in the context of the AI race (01:01:13). They expressed concern that Web2 dynamics and the advertising industrial complex could become irreparably manipulative due to AI, leading to negative outcomes like losing elections (01:01:56).

  • The Role of Personal AI and Hardware in the Future: Chris J Snook suggested that companies like Apple could win the AI race differently by providing the hardware necessary for personal AI (01:01:56). They noted that Apple is launching new M5 chips and selling out of Mac minis, and that other companies, such as Samsung and Dell, are watching this trend. While cloud models will win on user experience and initial exposure to AI, the possibility of running powerful personal AI models locally on a $400 computer or a new MacBook M5 is a significant counter-inertia (01:02:45).

  • The Shifting Incentive Model and Data Ownership: Chris J Snook, who has historically been an advocate for user-owned data, finds hope in the shifting incentive model, particularly related to Apple, suggesting that other companies will copy this approach. Cyndi Coon emphasized that the containers required for people to manage their own data represent a significant opportunity (01:04:23). Cyndi Coon stated that the current model of large corporations collecting data will not work indefinitely due to growing anger and human nature, necessitating new models in which large companies will need to pay for data (01:05:33).

  • Economic Prudence and Conscious Consumption: Chris J Snook noted that people are prioritizing their consumption differently due to pragmatism, which affects decisions regarding tuition or dining out (01:06:37). They drew a parallel with mobile phones, explaining that if consumers had to pay the full actual cost upfront, many would not buy them, illustrating how adjusted business models and financing made high-cost items affordable (01:07:36). Chris J Snook concluded that more people are waking up to planned obsolescence and are likely to become more consciously consumptive, choosing pragmatism over traditional models like college (01:08:47).

  • The Changing Business Models of Universities and Sports: Chris J Snook pointed out that universities’ business models have to change, citing the NCAA’s decision to allow athletes to monetize their Name, Image, and Likeness (NIL) as an example (01:09:41). They questioned what happens when it becomes the norm for college athletes to make $15-20 million annually, suggesting that college sports might transition into a semi-pro or minor league system (01:10:44). Cyndi Coon raised the question of what owning one’s own data means for a nine-year-old on a little league team, suggesting that agents may start fishing in that younger talent pool soon (01:12:31).

  • Name, Image, and Likeness as a Sovereign Asset: Chris J Snook shared an experience from a sports agent conference two years prior, where they advised that NIL and owning the data container represented the future for agents. They asserted that Name, Image, and Likeness is the asset that humans possess, representing their sovereignty from birthright (01:13:17). Both speakers agreed that a model for containing this personal asset is necessary, and the technology to achieve this already exists (01:14:06).

  • Ethical Concerns Regarding Meta’s Patent on Digital Ghosts: Chris J Snook brought up a recent report that Meta was granted a patent on people’s digital ghosts, which allows the company to continue posting under a deceased user’s persona in perpetuity (01:15:03). The reasoning provided in the application was that the absence of the deceased in a feed created a “bad user experience” for others, which Chris J Snook viewed as ethically grotesque and a sign of the end of that social media era (01:16:08). Cyndi Coon agreed that such actions show a lack of connection with core users, predicting that Facebook, specifically, may die when the Baby Boomer generation passes (01:17:01).

  • The Importance of Infrastructure and Data Pinch Points: Cyndi Coon suggested that the companies insulated from the social media decline are those, like Bezos’s Amazon, that are winning with data centers, which run everything (01:18:26). Chris J Snook discussed an initiative by Meta to own the largest private undersea cabling initiative, controlling the pipes through which data flows (01:19:36). This infrastructure ownership creates a “pinch point,” giving the corporation the ability to shut off or control the flow of data that reaches LLMs and users, which they view as a potential threat to human interest (01:20:39).

  • The Opportunity for Real-Time Awareness and Decentralized Creation: Chris J Snook shifted the conversation back to the positive by highlighting the opportunity for people to become aware faster than ever before (01:21:29). They shared the example of a friend who, in four hours, used OpenClaw to create watchwar.live, an open-sourced platform that translates real-time war updates into English without media spin, demonstrating how non-coders can execute designs rapidly (01:22:19). Cyndi Coon noted that basic coding knowledge, with the help of vibe coding, is often all that is necessary to create instant, usable prototypes (01:23:13).

  • The Need for Community to Overcome AI Overwhelm: Chris J Snook emphasized that the true opportunity lies in humans building and doing things in public together, as the noise and rapid development in the AI space are overwhelming (01:24:37). They stated that even with easily accessible tools, people will hesitate to start due to fear of doing it incorrectly, but seeing others build something tangible can provide the necessary inertia (01:25:52). Cyndi Coon agreed, stressing the importance of not going it alone, and they outlined their use of applied experiential futures, prototyping workshops, and intentional build sessions to facilitate collaborative learning and practice (01:27:25).

  • Collaborative Learning and Mindset in the AI Space: Cyndi Coon’s community engagement includes “learn out louds” (LOLs) for sharing discoveries, and a dedicated group focused on human practice over AI tools (01:28:48). They also establish “collabs” for organizations to bring together cohorts who explore AI without putting the entire burden on a few individuals (01:30:16). Cyndi Coon stressed that community is the only way through the “fire hose” of constant AI changes and updates, and that their primary work is focused on human mindset to help people feel capable of taking on these challenges (01:31:13).

  • The Search for Meaning in an Abundant World: Chris J Snook and Cyndi Coon discussed how an increase in capability and creation speed does not automatically translate to more meaning in one’s life (01:33:11). Chris J Snook compared it to the problems of wealth, noting that while it is a positive problem, abundance does not buy meaning, and the ability to rapidly invent things leads to an overwhelm of invention without operating or executing at scale (01:33:55). Cyndi Coon concluded that the current “great reckoning” requires individuals to define what they care about (01:34:44).

  • Human Connection as the Core Purpose: Cyndi Coon articulated a belief that all humans share the identical purpose of prioritizing each other, with the path toward that purpose being individually chosen. They asserted that without a focus on humans, things “won’t work,” and that community is the intention of humanity (01:35:49). Cyndi Coon urged those who understand AI to recognize the terror most of humanity feels about its impact and to “build rafts” to safely guide and support others through this transition (01:36:46).

  • Scarcity, Arbitrage, and the Value of Human Experience: Chris J Snook suggested that their recent focus has shifted to finding ways to be useful and add to opportunity, which they see as beneficial for both their own self-interest and for the well-being of their family (01:39:15). Chris J Snook, validating Cyndi Coon’s “raft” analogy, suggested that there is a purpose for everyone—from the raft’s inventor to the person simply sitting in the raft (01:40:17). Chris J Snook and Cyndi Coon agreed that in the new reality, humans will be the scarce asset, prized as “experiential human beings” and “empathy engines” (01:47:13).

  • Humans as the Moat and the Future of Capital Allocation: Chris J Snook argued that the scarcity of human experience, meaning, and significance will inherently take care of the capital allocation problem (01:48:17). They concluded that “humans are the moat,” and capital will be attracted to meeting human needs, wants, and desires because these ventures will be profitable (01:49:01). Cyndi Coon advised that people should form communities to solution-build instead of venting, focusing on creating economies where people can grow, thrive, learn, and earn together (01:50:05).

  • Solving Micro Problems for Infinite Scale: Chris J Snook suggested that individuals can now solve a micro problem for themselves or small communities of common interest, and that these solutions will scale infinitely because they live in an open-source world (01:51:16). They stated that this allows for an “enlightened and selfish” approach where people solve their own problems and, subsequently, everyone else benefits. Cyndi Coon reinforced this idea by referencing an exercise that asks “What do you want more of?” which often leads participants to seek more time with direct human contact and creative engagements (01:52:07).
