{"id":838,"date":"2020-08-10T12:13:44","date_gmt":"2020-08-10T17:13:44","guid":{"rendered":"http:\/\/capertongillett.com\/blog\/?p=838"},"modified":"2020-08-10T12:13:44","modified_gmt":"2020-08-10T17:13:44","slug":"ai-is-racist-and-sexist-because-we-were-first","status":"publish","type":"post","link":"https:\/\/capertongillett.com\/blog\/ai-is-racist-and-sexist-because-we-were-first\/","title":{"rendered":"AI is racist and sexist because we were first"},"content":{"rendered":"\n<p>Artificial intelligence is a handy tool. It\u2019s also a somewhat misleadingly named tool \u2014 as much emphasis as we tend to put on the \u201cintelligence\u201d part of it, it\u2019s the \u201cartificial\u201d part that could benefit from more emphasis. So I\u2019m going to talk about AI, and as befits such a thoroughly modern and high-tech subject, I\u2019m going to start by talking about this chick named Shirley circa 1950.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Shirley you jest<\/h2>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"alignright size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/capertongillett.com\/blog\/wp-content\/uploads\/2020\/08\/shirley-card.jpg\" alt=\"\" class=\"wp-image-841\" width=\"277\" height=\"400\"\/><figcaption>The actual Shirley.<\/figcaption><\/figure><\/div>\n\n\n\n<p>Back in the mid-1950s, when commercial photo processing was just becoming a thing and Kodak was sending its first color photo printers out into the world to independent photo labs, the company came up with a clever calibration kit. It sent out color prints and unexposed negatives, and when you processed the negatives such that they matched the prints, congratulations! Your printer was calibrated properly.<\/p>\n\n\n\n<p>Originally, all the prints were photos of Kodak model Shirley Page, who would have to sit for literally hundreds and hundreds of photos. 
Over time, of course, other models were used as well, all of them smiling, bright-eyed, brunette, white, and named Shirley. (Okay, not that last one. But the prints were still called \u201c<a href=\"https:\/\/www.npr.org\/2014\/11\/13\/363517842\/for-decades-kodak-s-shirley-cards-set-photography-s-skin-tone-standard\" target=\"_blank\" rel=\"noreferrer noopener\">Shirley cards<\/a>.\u201d) The result? Decades and decades of perfectly exposed white people and Black people who looked like they were <a href=\"https:\/\/theglowup.theroot.com\/annie-leibovitz-bashed-for-simone-biles-vogue-photos-f-1844375368\" target=\"_blank\" rel=\"noreferrer noopener\">photographed by Annie Leibovitz<\/a> (YEAH, I SAID IT). Kodak didn\u2019t release its first non-white Shirley card until the 1970s (and even then, <a href=\"https:\/\/www.nytimes.com\/2019\/04\/25\/lens\/sarah-lewis-racial-bias-photography.html\" target=\"_blank\" rel=\"noreferrer noopener\">it was the complaints of furniture companies and chocolate makers<\/a>, not BIPOC camera owners, that led to improvements in products and processing that resulted in film that worked for darker skin tones).<\/p>\n\n\n\n<p>There\u2019s no such thing as a racist photo printer. It\u2019s a machine. Machines have no motivation. (A photographer who never bothered to learn how to light dark skin? That\u2019s another story.) 
But a printer calibrated based on a flattering photo of a white model, shot by a white photographer, for use with a product that was, at the time, owned mostly by white people, was inevitably going to produce poor results for anyone who didn\u2019t look like Shirley.<\/p>\n\n\n\n<p>Which brings us, naturally, to Google Cloud Vision seeing bearded women in 2020.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Bearded ladies<\/h2>\n\n\n\n<p>This week, marketing agency <a href=\"https:\/\/www.wundermanthompson.com\/insight\/ai-and-gender-bias\" target=\"_blank\" rel=\"noreferrer noopener\">Wunderman Thompson\u2019s data group released a study<\/a> looking into how different visual AI systems handle images of men and women wearing PPE. They took 265 images each of men and women with an assortment of settings, mask types, and photo quality levels and ran them against Google Cloud Vision, IBM\u2019s Watson, and Microsoft Azure\u2019s Computer Vision to see what the systems thought they were seeing.<\/p>\n\n\n\n<p>All of the systems struggled to spot masks on the faces of men or women, although they were twice as good at identifying male mask-wearers as female ones. The AIs had some interesting thoughts about what they were seeing.<\/p>\n\n\n\n<ul class=\"wp-block-list\"><li>Microsoft was more likely to ID the mask as a fashion accessory on a woman (40%) than on a man (13%).<\/li><li>It was about as likely to identify the mask as an outrageous amount of lipstick on a woman (14%) as it was to call it a beard on a man (12%).<\/li><li>Watson tagged the mask as \u201crestraint\/chains\u201d and \u201cgag\u201d on 23% of women\u2019s photos and just 10% of men\u2019s.<\/li><li>Google identified PPE on 36% of men and just 19% of women.<\/li><li>On 15% of men\u2019s photos and 19% of women\u2019s, it mistook the PPE for duct tape.<\/li><\/ul>\n\n\n\n<p>An AI is only as good as the human being programming it and the data set it\u2019s being trained with. 
What does it say about both that Watson looks at a woman in PPE and sees her in restraints nearly a quarter of the time, or that Google Cloud Vision is more likely to think a woman has duct tape over her mouth than a surgical mask?<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Not what you think<\/h2>\n\n\n\n<p>What it doesn\u2019t mean is that some dickhole at Google was drunk at the office one night top-loading Cloud Vision\u2019s training set with women gagged with duct tape. It does mean that an AI is only as good as the data set available to it, and those data sets exist in a biased world. In their report, WT contrasts Google Image search results for \u201cduct tape man\u201d and \u201cduct tape woman.\u201d If \u201cduct tape man\u201d is more likely to turn up a photo of a man in an ill-advised Tin Man costume and \u201cduct tape woman\u201d is more likely to turn up a woman with tape over her mouth and mascara running down her cheeks, then yes, an AI that feeds on those same images will get some ideas about what a woman with half her face covered might be up to.<\/p>\n\n\n\n<p>WT also mentions <a href=\"https:\/\/thenextweb.com\/insider\/2016\/03\/24\/microsoft-pulled-plug-ai-chatbot-became-racist\/\" target=\"_blank\" rel=\"noreferrer noopener\">Tay, Microsoft\u2019s AI chatbot<\/a> that went from \u201chello world\u201d to \u201c9\/11 was an inside job and feminists should die\u201d in just 16 hours because of the trolls it was interacting with and the data it was scraping. I\u2019m confident that Tay\u2019s engineering team didn\u2019t, collectively, believe those things. 
But I\u2019ll bet it didn\u2019t have a lot of women on it, \u2019cause they would have warned Tay about the \u201cfeminists should burn in hell\u201d crowd before she went live.<\/p>\n\n\n\n<p><a href=\"https:\/\/www.wired.com\/story\/best-algorithms-struggle-recognize-black-faces-equally\/\">When visual AIs are given training sets that are mostly white men<\/a>, they\u2019ll provide more accurate results for white men than for people who aren\u2019t one or both of those things. <a href=\"https:\/\/www.theregister.com\/2020\/07\/01\/mit_dataset_removed\/\" target=\"_blank\" rel=\"noreferrer noopener\">When a neural network is trained from a dataset full of racial slurs and crude anatomical language<\/a>, it\u2019ll be more likely to decide that a woman holding a baby is a\u2026 never mind. When a Black teenager is murdered, and the news posts the most aggressive-looking photo it can find on Facebook, and a white college student is convicted of sexual assault, and the news posts a smiling photo from fraternity picture day, human beings aren\u2019t the only ones who get a messed-up idea of what criminals and victims look like.<\/p>\n\n\n\n<p>That doesn\u2019t mean AI is inherently evil. It just means it isn\u2019t any less flawed than the rest of us. AI isn\u2019t smarter than we are \u2014 just faster. And so we shouldn\u2019t depend on AI to do anything <em>better<\/em> than we do \u2014 just faster.&nbsp;<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Faster, Pussycat<\/h2>\n\n\n\n<p>As pointed out by TheNextWeb, algorithms are better at telling you the past than the future. Predictive policing systems tell you where arrests have been made, not where crime is likely to happen. Sentencing algorithms tell you how sentences have been handed down, not what sentences are warranted. Hiring algorithms tell you which candidates have always been hired for a position, not which ones will do the job best. 
So if you want to get what you\u2019ve been getting, but faster and with less effort, an AI trained on historical data is a great way to do that.<\/p>\n\n\n\n<p>Just two months ago, <a href=\"https:\/\/www.zdnet.com\/article\/ibm-announces-exit-of-facial-recognition-business\/\" target=\"_blank\" rel=\"noreferrer noopener\">IBM announced that it\u2019s getting out of the facial-recognition game<\/a>, because of the potential for bias and, thus, for its technology to be used in ways that violate people\u2019s privacy and human rights. (I mean, if it can only tell a woman\u2019s wearing a mask five percent of the time, maybe visual AI isn\u2019t its strength anyway, but that\u2019s neither here nor there.) But that\u2019s an important reminder: This is a highly subjective tool with a huge potential to tell you exactly what you want to hear.<\/p>\n\n\n\n<p>Think of visual AI, and other kinds of AI, like self-driving cars: They\u2019re fun, and they can be handy tools when you need to get somewhere and want to be able to answer email while your car does its own driving. But if it\u2019s really, really important that you get to the hospital before your appendix ruptures, your best bet is to have an actual human behind the wheel. It\u2019s better than arresting the wrong person because <a href=\"https:\/\/thenextweb.com\/neural\/2020\/06\/24\/stop-calling-it-bias-ai-is-racist\/\" target=\"_blank\" rel=\"noreferrer noopener\">your facial recognition system thinks all Black people look alike<\/a>.<\/p>\n\n\n\n<p>Um, I mean, getting into an accident. It&#8217;s better than getting into an accident.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Artificial intelligence is a handy tool. 
It\u2019s also a somewhat misleadingly named tool \u2014 as much emphasis as we tend to put on the \u201cintelligence\u201d part of it, it\u2019s the \u201cartificial\u201d &hellip;<\/p>\n","protected":false},"author":1,"featured_media":841,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6],"tags":[30,60,66],"class_list":["post-838","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-the-biz","tag-ethics","tag-race","tag-technology"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.3 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>AI is racist and sexist because we were first - Caperton Gillett | The Blog<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/capertongillett.com\/blog\/ai-is-racist-and-sexist-because-we-were-first\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"AI is racist and sexist because we were first - Caperton Gillett | The Blog\" \/>\n<meta property=\"og:description\" content=\"Artificial intelligence is a handy tool. 
It\u2019s also somewhat misleadingly named tool \u2014 as much emphasis as we tend to put on the \u201cintelligence\u201d part of it, it\u2019s the \u201cartificial\u201d &hellip;\" \/>\n<meta property=\"og:url\" content=\"https:\/\/capertongillett.com\/blog\/ai-is-racist-and-sexist-because-we-were-first\/\" \/>\n<meta property=\"og:site_name\" content=\"Caperton Gillett | The Blog\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/capertongcreative\/\" \/>\n<meta property=\"article:author\" content=\"https:\/\/www.facebook.com\/capertongcreative\" \/>\n<meta property=\"article:published_time\" content=\"2020-08-10T17:13:44+00:00\" \/>\n<meta name=\"author\" content=\"Caper\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@CapeGCreative\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Caper\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"7 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/ai-is-racist-and-sexist-because-we-were-first\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/ai-is-racist-and-sexist-because-we-were-first\\\/\"},\"author\":{\"name\":\"Caper\",\"@id\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/#\\\/schema\\\/person\\\/a8b294ce5e33f2905e82f8435c3650c4\"},\"headline\":\"AI is racist and sexist because we were first\",\"datePublished\":\"2020-08-10T17:13:44+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/ai-is-racist-and-sexist-because-we-were-first\\\/\"},\"wordCount\":1349,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/ai-is-racist-and-sexist-because-we-were-first\\\/#primaryimage\"},\"thumbnailUrl\":\"\",\"keywords\":[\"Ethics\",\"Race\",\"Technology\"],\"articleSection\":[\"The Biz\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\\\/\\\/capertongillett.com\\\/blog\\\/ai-is-racist-and-sexist-because-we-were-first\\\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/ai-is-racist-and-sexist-because-we-were-first\\\/\",\"url\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/ai-is-racist-and-sexist-because-we-were-first\\\/\",\"name\":\"AI is racist and sexist because we were first - Caperton Gillett | The 
Blog\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/ai-is-racist-and-sexist-because-we-were-first\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/ai-is-racist-and-sexist-because-we-were-first\\\/#primaryimage\"},\"thumbnailUrl\":\"\",\"datePublished\":\"2020-08-10T17:13:44+00:00\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/ai-is-racist-and-sexist-because-we-were-first\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/capertongillett.com\\\/blog\\\/ai-is-racist-and-sexist-because-we-were-first\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/ai-is-racist-and-sexist-because-we-were-first\\\/#primaryimage\",\"url\":\"\",\"contentUrl\":\"\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/ai-is-racist-and-sexist-because-we-were-first\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"AI is racist and sexist because we were first\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/#website\",\"url\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/\",\"name\":\"Caperton Gillett | The Blog\",\"description\":\"A blog about advertising, copywriting, creativity 
&amp;c.\",\"publisher\":{\"@id\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/#organization\",\"name\":\"Caperton Gillett Creative\",\"url\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/wp-content\\\/uploads\\\/2024\\\/11\\\/cropped-site-icon.png\",\"contentUrl\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/wp-content\\\/uploads\\\/2024\\\/11\\\/cropped-site-icon.png\",\"width\":512,\"height\":512,\"caption\":\"Caperton Gillett 
Creative\"},\"image\":{\"@id\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/www.facebook.com\\\/capertongcreative\\\/\",\"https:\\\/\\\/www.linkedin.com\\\/in\\\/acgillett\\\/\"]},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/#\\\/schema\\\/person\\\/a8b294ce5e33f2905e82f8435c3650c4\",\"name\":\"Caper\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/76a8676c5e46f71e04b1e36e7a86c8ed2ad109a4dac4e70e3842a19b1ead7279?s=96&d=mm&r=pg\",\"url\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/76a8676c5e46f71e04b1e36e7a86c8ed2ad109a4dac4e70e3842a19b1ead7279?s=96&d=mm&r=pg\",\"contentUrl\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/76a8676c5e46f71e04b1e36e7a86c8ed2ad109a4dac4e70e3842a19b1ead7279?s=96&d=mm&r=pg\",\"caption\":\"Caper\"},\"sameAs\":[\"https:\\\/\\\/capertongillett.com\\\/blog\",\"https:\\\/\\\/www.facebook.com\\\/capertongcreative\",\"https:\\\/\\\/www.linkedin.com\\\/in\\\/acgillett\\\/\",\"https:\\\/\\\/x.com\\\/CapeGCreative\"],\"url\":\"https:\\\/\\\/capertongillett.com\\\/blog\\\/author\\\/caper\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"AI is racist and sexist because we were first - Caperton Gillett | The Blog","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/capertongillett.com\/blog\/ai-is-racist-and-sexist-because-we-were-first\/","og_locale":"en_US","og_type":"article","og_title":"AI is racist and sexist because we were first - Caperton Gillett | The Blog","og_description":"Artificial intelligence is a handy tool. 
It\u2019s also somewhat misleadingly named tool \u2014 as much emphasis as we tend to put on the \u201cintelligence\u201d part of it, it\u2019s the \u201cartificial\u201d &hellip;","og_url":"https:\/\/capertongillett.com\/blog\/ai-is-racist-and-sexist-because-we-were-first\/","og_site_name":"Caperton Gillett | The Blog","article_publisher":"https:\/\/www.facebook.com\/capertongcreative\/","article_author":"https:\/\/www.facebook.com\/capertongcreative","article_published_time":"2020-08-10T17:13:44+00:00","author":"Caper","twitter_card":"summary_large_image","twitter_creator":"@CapeGCreative","twitter_misc":{"Written by":"Caper","Est. reading time":"7 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/capertongillett.com\/blog\/ai-is-racist-and-sexist-because-we-were-first\/#article","isPartOf":{"@id":"https:\/\/capertongillett.com\/blog\/ai-is-racist-and-sexist-because-we-were-first\/"},"author":{"name":"Caper","@id":"https:\/\/capertongillett.com\/blog\/#\/schema\/person\/a8b294ce5e33f2905e82f8435c3650c4"},"headline":"AI is racist and sexist because we were first","datePublished":"2020-08-10T17:13:44+00:00","mainEntityOfPage":{"@id":"https:\/\/capertongillett.com\/blog\/ai-is-racist-and-sexist-because-we-were-first\/"},"wordCount":1349,"commentCount":0,"publisher":{"@id":"https:\/\/capertongillett.com\/blog\/#organization"},"image":{"@id":"https:\/\/capertongillett.com\/blog\/ai-is-racist-and-sexist-because-we-were-first\/#primaryimage"},"thumbnailUrl":"","keywords":["Ethics","Race","Technology"],"articleSection":["The 
Biz"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/capertongillett.com\/blog\/ai-is-racist-and-sexist-because-we-were-first\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/capertongillett.com\/blog\/ai-is-racist-and-sexist-because-we-were-first\/","url":"https:\/\/capertongillett.com\/blog\/ai-is-racist-and-sexist-because-we-were-first\/","name":"AI is racist and sexist because we were first - Caperton Gillett | The Blog","isPartOf":{"@id":"https:\/\/capertongillett.com\/blog\/#website"},"primaryImageOfPage":{"@id":"https:\/\/capertongillett.com\/blog\/ai-is-racist-and-sexist-because-we-were-first\/#primaryimage"},"image":{"@id":"https:\/\/capertongillett.com\/blog\/ai-is-racist-and-sexist-because-we-were-first\/#primaryimage"},"thumbnailUrl":"","datePublished":"2020-08-10T17:13:44+00:00","breadcrumb":{"@id":"https:\/\/capertongillett.com\/blog\/ai-is-racist-and-sexist-because-we-were-first\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/capertongillett.com\/blog\/ai-is-racist-and-sexist-because-we-were-first\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/capertongillett.com\/blog\/ai-is-racist-and-sexist-because-we-were-first\/#primaryimage","url":"","contentUrl":""},{"@type":"BreadcrumbList","@id":"https:\/\/capertongillett.com\/blog\/ai-is-racist-and-sexist-because-we-were-first\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/capertongillett.com\/blog\/"},{"@type":"ListItem","position":2,"name":"AI is racist and sexist because we were first"}]},{"@type":"WebSite","@id":"https:\/\/capertongillett.com\/blog\/#website","url":"https:\/\/capertongillett.com\/blog\/","name":"Caperton Gillett | The Blog","description":"A blog about advertising, copywriting, creativity 
&amp;c.","publisher":{"@id":"https:\/\/capertongillett.com\/blog\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/capertongillett.com\/blog\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/capertongillett.com\/blog\/#organization","name":"Caperton Gillett Creative","url":"https:\/\/capertongillett.com\/blog\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/capertongillett.com\/blog\/#\/schema\/logo\/image\/","url":"https:\/\/capertongillett.com\/blog\/wp-content\/uploads\/2024\/11\/cropped-site-icon.png","contentUrl":"https:\/\/capertongillett.com\/blog\/wp-content\/uploads\/2024\/11\/cropped-site-icon.png","width":512,"height":512,"caption":"Caperton Gillett Creative"},"image":{"@id":"https:\/\/capertongillett.com\/blog\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/capertongcreative\/","https:\/\/www.linkedin.com\/in\/acgillett\/"]},{"@type":"Person","@id":"https:\/\/capertongillett.com\/blog\/#\/schema\/person\/a8b294ce5e33f2905e82f8435c3650c4","name":"Caper","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/secure.gravatar.com\/avatar\/76a8676c5e46f71e04b1e36e7a86c8ed2ad109a4dac4e70e3842a19b1ead7279?s=96&d=mm&r=pg","url":"https:\/\/secure.gravatar.com\/avatar\/76a8676c5e46f71e04b1e36e7a86c8ed2ad109a4dac4e70e3842a19b1ead7279?s=96&d=mm&r=pg","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/76a8676c5e46f71e04b1e36e7a86c8ed2ad109a4dac4e70e3842a19b1ead7279?s=96&d=mm&r=pg","caption":"Caper"},"sameAs":["https:\/\/capertongillett.com\/blog","https:\/\/www.facebook.com\/capertongcreative","https:\/\/www.linkedin.com\/in\/acgillett\/","https:\/\/x.com\/CapeGCreative"],"url":"https:\/\/capertongillett.com\/blog\/author\/caper\/"}]}},"_links":{"self":[{"href":"https:\/\/capertongillett.com\/blog\/w
p-json\/wp\/v2\/posts\/838","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/capertongillett.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/capertongillett.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/capertongillett.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/capertongillett.com\/blog\/wp-json\/wp\/v2\/comments?post=838"}],"version-history":[{"count":0,"href":"https:\/\/capertongillett.com\/blog\/wp-json\/wp\/v2\/posts\/838\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/capertongillett.com\/blog\/wp-json\/"}],"wp:attachment":[{"href":"https:\/\/capertongillett.com\/blog\/wp-json\/wp\/v2\/media?parent=838"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/capertongillett.com\/blog\/wp-json\/wp\/v2\/categories?post=838"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/capertongillett.com\/blog\/wp-json\/wp\/v2\/tags?post=838"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}