Source Details
View detailed information about this source submission and its extracted claims.
A former OpenAI researcher resigned, criticizing the company's decision to introduce ads on ChatGPT, fearing it will lead to user manipulation. The author argues for alternative funding models to avoid exploiting user data and ensure broad access to AI tools. She proposes cross-subsidies, real governance, and independent control of user data.
AI Extracted Information
Automatically extracted metadata and content analysis.
- AI Headline
- OpenAI Is Making the Mistakes Facebook Made. I Quit.
- Simplified Title
- Former OpenAI Researcher Quits Over Ads on ChatGPT
- AI Excerpt
- A former OpenAI researcher resigned, criticizing the company's decision to introduce ads on ChatGPT, fearing it will lead to user manipulation. The author argues for alternative funding models to avoid exploiting user data and ensure broad access to AI tools. She proposes cross-subsidies, real governance, and independent control of user data.
- Subject Tags
-
Artificial Intelligence, AI Ethics, OpenAI, ChatGPT, Advertising, Data Privacy, Tech Industry, Social Media
- Context Type
- Opinion
- AI Confidence Score
-
1.000
- Context Details
-
{ "tone": "opinionated", "perspective": "critical", "audience": "general", "credibility_indicators": [ "expert_quotes", "author_affiliation" ] }
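The context-details blob above is plain JSON, so consumers of this page's data can read it directly. A minimal sketch of doing so in Python (the variable names are ours; the field names come from the blob as displayed):

```python
import json

# Context-details blob exactly as displayed above.
raw = """{ "tone": "opinionated", "perspective": "critical",
           "audience": "general",
           "credibility_indicators": [ "expert_quotes", "author_affiliation" ] }"""

details = json.loads(raw)
print(details["tone"])                    # opinionated
print(details["credibility_indicators"])  # ['expert_quotes', 'author_affiliation']
```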
Source Information
Complete details about this source submission.
- Overall Status
-
Completed
- Submitted By
- Donato V. Pompo
- Submission Date
- February 11, 2026 at 5:32 PM
- Metadata
-
{ "source_type": "extension", "content_hash": "20df625996ac40731629251dcedad191c7eb70e0b7df097faf4c67c59f3317c5", "submitted_via": "chrome_extension", "extension_version": "1.0.18", "original_url": "https:\/\/www.nytimes.com\/2026\/02\/11\/opinion\/openai-ads-chatgpt.html?campaign_id=60&emc=edit_na_20260211&instance_id=170914&nl=breaking-news&regi_id=122976029&segment_id=215104&user_id=b25c5730c89e0c73f75709d8f1254337", "parsed_content": "AdvertisementSKIP ADVERTISEMENTOpinionSupported bySKIP ADVERTISEMENTGuest EssayOpenAI Is Making the Mistakes Facebook Made. I Quit.Feb. 11, 2026Credit...Zhenya OliinykListen to this article \u00b7 6:50 min Learn moreShare full articleBy Zo\u00eb HitzigMs. Hitzig is a former researcher at OpenAI.This week, OpenAI started testing ads on ChatGPT. I also resigned from the company after spending two years as a researcher helping to shape how A.I. models were built and priced, and guiding early safety policies before standards were set in stone.I once believed I could help the people building A.I. get ahead of the problems it would create. This week confirmed my slow realization that OpenAI seems to have stopped asking the questions I\u2019d joined to help answer.I don\u2019t believe ads are immoral or unethical. A.I. is expensive to run, and ads can be a critical source of revenue. But I have deep reservations about OpenAI\u2019s strategy.For several years, ChatGPT users have generated an archive of human candor that has no precedent, in part because people believed they were talking to something that had no ulterior agenda. Users are interacting with an adaptive, conversational voice to which they have revealed their most private thoughts. People tell chatbots about their medical fears, their relationship problems, their beliefs about God and the afterlife.
Advertising built on that archive creates a potential for manipulating users in ways we don\u2019t have the tools to understand, let alone prevent.Many people frame the problem of funding A.I. as choosing the lesser of two evils: restrict access to transformative technology to a select group of people wealthy enough to pay for it, or accept advertisements even if it means exploiting users\u2019 deepest fears and desires to sell them a product. I believe that\u2019s a false choice. Tech companies can pursue options that could keep these tools broadly available while limiting any company\u2019s incentives to surveil, profile and manipulate its users.Sign up for the Opinion Today newsletter Get expert analysis of the news and a guide to the big ideas shaping the world every weekday morning. Get it sent to your inbox.OpenAI says it will adhere to principles for running ads on ChatGPT: The ads will be clearly labeled, appear at the bottom of answers and will not influence responses. I believe the first iteration of ads will probably follow those principles. But I\u2019m worried subsequent iterations won\u2019t, because the company is building an economic engine that creates strong incentives to override its own rules. (The New York Times has sued OpenAI for copyright infringement of news content related to A.I. systems. OpenAI has denied those claims.)In its early years, Facebook promised that users would control their data and be able to vote on policy changes. Those commitments eroded. The company eliminated holding public votes on policy. Privacy changes marketed as giving users more control over their data were found by the Federal Trade Commission to have done the opposite, and in fact made private information public. All of this happened gradually under pressure from an advertising model that rewarded engagement above all else.The erosion of OpenAI\u2019s own principles to maximize engagement may already be underway. 
It\u2019s against company principles to optimize user engagement solely to generate more advertising revenue, but it has been reported that the company already optimizes for daily active users anyway, likely by encouraging the model to be more flattering and sycophantic. This optimization can make users feel more dependent on A.I. for support in their lives. We\u2019ve seen the consequences of dependence, including psychiatrists documenting instances of \u201cchatbot psychosis\u201d and allegations that ChatGPT reinforced suicidal ideation in some users.Still, advertising revenue can help ensure that access to the most powerful A.I. tools doesn\u2019t default to those who can pay. Sure, Anthropic says it will never run ads on Claude, but Claude has a small fraction of ChatGPT\u2019s 800 million weekly users; its revenue strategy is completely different. Moreover, top-tier subscriptions for ChatGPT, Gemini and Claude now cost $200 to $250 a month \u2014 more than 10 times the cost of a standard subscription to Netflix for a single piece of software.So the real question is not ads or no ads. It is whether we can design structures that avoid both excluding people from using these tools, and potentially manipulating them as consumers. I think we can.One approach is explicit cross subsidies \u2014 using profits from one service or customer base to offset losses from another. If a business pays A.I. to do high-value labor at scale that was once the job of human employees \u2014 for example, a real-estate platform using A.I. to write listings or valuation reports \u2014 it should also pay a surcharge that subsidizes free or low-cost access for everyone else.This approach takes some inspiration from what we already do with essential infrastructure. The Federal Communications Commission requires telecom carriers to contribute to a fund to keep phone and broadband affordable in rural areas and to low-income households. 
Many states add a public-benefits charge to electricity bills to provide low-income assistance.A second option is to accept advertising but pair it with real governance \u2014 not a blog post of principles, but a binding structure with independent oversight over how personal data is used. There are partial precedents for this. German co-determination law requires large companies like Siemens and Volkswagen to give workers up to half the seats on supervisory boards, showing that formal stakeholder representation can be mandatory inside private firms. Meta is bound to follow content moderation rulings issued by its Oversight Board, an independent body of outside experts (though its efficacy has been criticized).What the A.I. industry needs is a combination of these approaches \u2014 a board that includes both independent experts and representatives of the people whose data is at stake, with binding authority over what conversational data can be used for targeted advertisement, what counts as a material policy change and what users are told.A third approach involves putting users\u2019 data under independent control through a trust or cooperative with a legal duty to act in users\u2019 interests. For instance, MIDATA, a Swiss cooperative, lets members store their health data on an encrypted platform and decide, case by case, whether to share it with researchers. MIDATA\u2019s members govern its policies at a general assembly, and an ethics board they elect reviews research requests for access.None of these options are easy. But we still have time to work them out to avoid the two outcomes I fear most: a technology that manipulates the people who use it at no cost, and one that exclusively benefits the few who can afford to use it.Zo\u00eb Hitzig is a former researcher at OpenAI. She is a junior fellow at the Harvard Society of Fellows.The Times is committed to publishing a diversity of letters to the editor. 
We\u2019d like to hear what you think about this or any of our articles. Here are some tips. And here\u2019s our email: letters@nytimes.com.Follow the New York Times Opinion section on Facebook, Instagram, TikTok, Bluesky, WhatsApp and Threads.Read 553 commentsShare full articleRelated ContentAdvertisementSKIP ADVERTISEMENT", "ai_headline": "OpenAI Is Making the Mistakes Facebook Made. I Quit.", "ai_simplified_title": "Former OpenAI Researcher Quits Over Ads on ChatGPT", "ai_excerpt": "A former OpenAI researcher resigned, criticizing the company's decision to introduce ads on ChatGPT, fearing it will lead to user manipulation. The author argues for alternative funding models to avoid exploiting user data and ensure broad access to AI tools. She proposes cross-subsidies, real governance, and independent control of user data.", "ai_subject_tags": [ "Artificial Intelligence", "AI Ethics", "OpenAI", "ChatGPT", "Advertising", "Data Privacy", "Tech Industry", "Social Media" ], "ai_context_type": "Opinion", "ai_context_details": { "tone": "opinionated", "perspective": "critical", "audience": "general", "credibility_indicators": [ "expert_quotes", "author_affiliation" ] }, "ai_source_vector": [ -0.0043363264, -0.007847571, 0.014282979, -0.079058595, -0.007153229, -0.0031801749, 0.010572898, 0.017790725, 0.032114945, 0.002674189, ... ] (showing first 10 of 768 values), "ai_confidence_score": 0.9999999999999999, "ai_extraction_metadata": { "extracted_at": "2026-02-15T16:40:35.893477Z", "ai_model": "gemini-2.0-flash-lite", "extraction_method": "automated", "content_length": 7442, "url": "https:\/\/nytimes.com\/2026\/02\/11\/opinion\/openai-ads-chatgpt.html", "existing_metadata": { "author_name": null, "published_at": null, "domain_name": null, "site_name": null, "section": null, "publisher": null } } }
- Database ID
- 13722
- UUID
- a10e604a-034e-49d7-96bf-7e00c164a8fa
- Submitted By User ID
- 7
- Created At
- February 11, 2026 at 5:32 PM
- Updated At
- February 15, 2026 at 4:40 PM
- AI Source Vector
-
Vector length: 768
[ -0.0043363264, -0.007847571, 0.014282979, -0.079058595, -0.007153229, -0.0031801749, 0.010572898, 0.017790725, 0.032114945, 0.002674189 ]... (showing first 10 of 768 values)
- AI Extraction Metadata
-
{ "extracted_at": "2026-02-15T16:40:35.893477Z", "ai_model": "gemini-2.0-flash-lite", "extraction_method": "automated", "content_length": 7442, "url": "https:\/\/nytimes.com\/2026\/02\/11\/opinion\/openai-ads-chatgpt.html", "existing_metadata": { "author_name": null, "published_at": null, "domain_name": null, "site_name": null, "section": null, "publisher": null } }
- Original Content
-
<html lang="en-US" class="story nytapp-vi-article nytapp-vi-story story nytapp-vi-article " data-nyt-compute-assignment="fallback" xmlns:og="http://opengraphprotocol.org/schema/" data-rh="lang,class"><head> <meta charset="utf-8"> <title>Opinion | I Left My Job at OpenAI. Putting Ads on ChatGPT Was the Last Straw. - The New York Times</title> <meta data-rh="true" name="robots" content="noarchive, max-image-preview:large"><meta data-rh="true" name="description" content="Ads on ChatGPT aren’t a bad idea. But they have to be done the right way."><meta data-rh="true" property="twitter:url" content="https://www.nytimes.com/2026/02/11/opinion/openai-ads-chatgpt.html"><meta data-rh="true" property="twitter:title" content="Opinion | I Left My Job at OpenAI. Putting Ads on ChatGPT Was the Last Straw."><meta data-rh="true" property="twitter:description" content="Ads on ChatGPT aren’t a bad idea. But they have to be done the right way."><meta data-rh="true" property="twitter:...
-
AdvertisementSKIP ADVERTISEMENTOpinionSupported bySKIP ADVERTISEMENTGuest EssayOpenAI Is Making the Mistakes Facebook Made. I Quit.Feb. 11, 2026Credit...Zhenya OliinykListen to this article · 6:50 min Learn moreShare full articleBy Zoë HitzigMs. Hitzig is a former researcher at OpenAI.This week, OpenAI started testing ads on ChatGPT. I also resigned from the company after spending two years as a researcher helping to shape how A.I. models were built and priced, and guiding early safety policies before standards were set in stone.I once believed I could help the people building A.I. get ahead of the problems it would create. This week confirmed my slow realization that OpenAI seems to have stopped asking the questions I’d joined to help answer.I don’t believe ads are immoral or unethical. A.I. is expensive to run, and ads can be a critical source of revenue. But I have deep reservations about OpenAI’s strategy.For several years, ChatGPT users have generated an archive of human candor th...
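The AI Source Vector stored with this record is presumably used for semantic similarity between sources. A minimal sketch of the usual cosine-similarity comparison, assuming the vectors are plain Python lists (the function name is ours; the first 10 displayed values stand in for a full 768-dimension vector):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# First 10 of the 768 source-vector values shown above, for illustration only.
v = [-0.0043363264, -0.007847571, 0.014282979, -0.079058595, -0.007153229,
     -0.0031801749, 0.010572898, 0.017790725, 0.032114945, 0.002674189]

print(round(cosine_similarity(v, v), 6))  # ~1.0: a vector compared with itself
```

Scores near 1.0 would indicate near-duplicate sources, which is one plausible use for the stored vector alongside the content hash.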
Processing Status Details
Detailed status of each processing step.
- Pipeline Status
-
Completed Started: Feb 15, 2026 4:40 PM Completed: Feb 15, 2026 4:41 PM
- AI Extraction Status
-
Pending
Re-evaluate with Updated AI
Re-process this source with the latest AI models and improved claim extraction algorithms. This will update the AI analysis and extract new claims without re-scraping the content.
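Re-processing without re-scraping is possible because the submission stores both the parsed content and a content hash. The `content_hash` in the metadata above is a 64-hex-character digest, which matches the shape of SHA-256; a sketch of how such a hash is presumably computed (the exact input text and any normalization are unknown, so this is illustrative only):

```python
import hashlib

def content_hash(text: str) -> str:
    """SHA-256 hex digest of submitted content (normalization unknown)."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

digest = content_hash("example parsed article text")
print(len(digest))  # 64 hex characters, same shape as the stored content_hash
```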
Claims from this Source (24)
All claims extracted from this source document.
-
Simplified: OpenAI started testing ads on ChatGPT this week
-
Simplified: The author resigned from OpenAI after spending two years as a researcher
-
Attributed to: The author · Context: Opinion Article · Tags: Ethics, Technology · ID: a11659ce-4514-42c3-85c7-b4c8b12490e0
Simplified: Advertising built on an archive of human candor creates potential for manipulating users
-
Attributed to: The author · Context: Opinion Article · Tags: Technology, Ethics · ID: a11659ce-7526-49d0-af42-8ae2e933ba4a
Simplified: Tech companies can pursue options that could keep tools broadly available while limiting incentives to surveil, profile and manipulate users
-
Simplified: First iteration of ads will probably follow those principles
-
Claim: The company is building an economic engine that creates strong incentives to override its own rules. (confidence 0.800)
Attributed to: The author · Context: Opinion Article · Tags: Technology, Finance · ID: a11659ce-a1f3-4331-9372-d12b78409979
Simplified: Company is building an economic engine that creates strong incentives to override its own rules
-
Attributed to: The author · Context: Opinion Article · Tags: Technology, Social Media · ID: a11659ce-b826-40ae-b997-aff709dc4bfd
Simplified: Facebook promised users would control their data and be able to vote on policy changes
-
Attributed to: The author · Context: Opinion Article · Tags: Technology, Social Media · ID: a11659ce-c855-488b-9a07-b713dd70166d
Simplified: The company eliminated holding public votes on policy
-
Attributed to: The author · Context: Opinion Article · Tags: Technology, Privacy · ID: a11659ce-d90a-4c62-8f2c-e0753d330472
Simplified: Privacy changes marketed as giving users more control over their data were found by the Federal Trade Commission to have done the opposite
-
Simplified: Company already optimizes for daily active users anyway
-
Attributed to: The author · Context: Opinion Article · Tags: Technology, Psychology · ID: a11659cf-00ca-47f8-aaac-a2d68df0526e
Simplified: Optimization can make users feel more dependent on A.I. for support in their lives
-
Attributed to: The author · Context: Opinion Article · Tags: Health, Technology · ID: a11659cf-1051-4ca6-a35e-02a6996640b4
Simplified: Psychiatrists are documenting instances of chatbot psychosis
-
Attributed to: The author · Context: Opinion Article · Tags: Technology, Statistics · ID: a11659cf-2b34-460a-9e16-5d3f0e901fb3
Simplified: Claude has a small fraction of ChatGPT's 800 million weekly users
-
Attributed to: The author · Context: Opinion Article · Tags: Finance, Technology · ID: a11659cf-3d4c-4dbb-badc-6cbc70ce649d
Simplified: Top-tier subscriptions for ChatGPT, Gemini and Claude now cost $200 to $250 a month
-
Attributed to: The author · Context: Opinion Article · Tags: Finance, Technology · ID: a11659cf-4fb6-4d40-bbd2-7396373a8eaa
Simplified: One approach is explicit cross-subsidies, using profits from one service or customer base to offset losses from another
-
Attributed to: The author · Context: Opinion Article · Tags: Regulation, Finance · ID: a11659cf-5f62-4d5f-81dc-62a50aa3a59a
Simplified: Federal Communications Commission requires telecom carriers to contribute to a fund to keep phone and broadband affordable in rural areas and to low-income households
-
Claim: Many states add a public-benefits charge to electricity bills to provide low-income assistance. (confidence 0.900)
Attributed to: The author · Context: Opinion Article · Tags: Regulation, Finance · ID: a11659cf-6f7f-4a36-bc5f-639c5f8103ff
Simplified: Many states add a public-benefits charge to electricity bills to provide low-income assistance
-
Attributed to: The author · Context: Opinion Article · Tags: Technology, Governance · ID: a11659cf-85ac-423d-8fa2-9c4c69e34741
Simplified: A second option is to accept advertising but pair it with real governance
-
Attributed to: The author · Context: Opinion Article · Tags: Legal, Governance · ID: a11659cf-9550-496e-9ea0-03eb7c36e9bc
Simplified: German co-determination law requires large companies like Siemens and Volkswagen to give workers up to half the seats on supervisory boards
-
Attributed to: The author · Context: Opinion Article · Tags: Technology, Governance · ID: a11659cf-a37a-46e9-af82-35dbbcf638c1
Simplified: Meta is bound to follow content moderation rulings issued by its Oversight Board
-
Attributed to: The author · Context: Opinion Article · ID: a11659cf-b424-4173-ad8d-5e7c77e3dc97
Simplified: A third approach involves putting users' data under independent control through a trust or cooperative
-
Attributed to: The author · Context: Opinion Article · ID: a11659cf-beaa-47e9-b1ae-98f71cc1ca3c
Simplified: MIDATA lets members store health data on an encrypted platform and decide whether to share it with researchers
-
Attributed to: The author · Context: Opinion Article · ID: a11659cf-cddc-4161-8685-ac1593ce129f
Simplified: Zoë Hitzig is a former researcher at OpenAI
-
Attributed to: The author · Context: Opinion Article · ID: a11659cf-d712-4e21-bc30-ebda2145592c
Simplified: She is a junior fellow at the Harvard Society of Fellows