Skip to main content

tv   The Stream Deepfakes in Politics - An AI Election Era  Al Jazeera  May 17, 2024 8:30am-9:01am AST

8:30 am
which of 11 women are killed every day in the country, faced with endless hurdles to just as relatives often find other ways to honor their loved ones. many families, higher artist to pain their names and faces on city streets as a reminder of their tragic deaths. these are the final touches on my deal. ideas is mural. she disappeared 6 years ago for sisters overseeing the artwork and hoping her country can move away from this terrible trend that is going way. yeah, it's very hard to have trust of to what's happened to our family, but it's good to give a bit of confidence to the income and government. we hope more attention is given to these cases. to this issue, we want to see a commitment from our government, but many don't say mexico kind of for to just wait for legislators or even a female president to change things. the henderson support group brings together men with violent tests to try to help them to redefine what it means to be
8:31 am
a man. the nightly meetings they talk about accepting their violent behavior. and crucially developing techniques to prevent any future outbursts. these men have at least taking the 1st step by recognizing they were part of the problem. the urgent challenge now is for the rest of mexican society to stop normalizing violence against women and girls. diego, yeah, i know, i'll just 0 mexico city. at least 4 people have died in the us state of texas and heavy storms. high speed winds down trees and power lines in houston, leaving about 800000 people without electricity. the windows of several high rise buildings were blowing out. the public safety department has issued flood warnings for texas and louisiana. this donald trump's defense team has tried to undermine
8:32 am
his former lawyer, michael co injuring cross examination of him in terms harsh money trial. the former us president, his lawyers forced co in to admit he lied under oath. in previous cases, trump, he did not guilty to study for felony counts for allegedly false fine business records to cover up payments to an adult film star co and says he made those cash transfers that trumps requests to protect the 2016 election campaign. and prosecutor is in south africa say for my present jacobs, whom i will go on trial for a ledge corruption in april next year. so my face is multiple counts of corruption as well as rockets hearing tax evasion, and money laundering us won't be charged 4 years ago with taking bribes in connection with a controversial, multi $1000000000.00 friend john steele. destiny. so now the stream is, makes the a lot of the stories that we cover of heidi complex. so it's very important that we
8:33 am
make them as understandable as we can. as long as you say, we're correspondence, that's what we strive to do. we've all seen defect videos as politicians take carver off social media from from using gen. the slang that came, don't learn during apple policies with 2020 full set to be the biggest election year in history. many a asking whether this could be the when a why not the people is responsible for the outcome of elections? marion false. why this is the street, the dear people of the do that it is the donald and i back all one of the greatest most open platforms that has ever existed. a human between us bridge and whistle. oh,
8:34 am
see the so to, from practice on to mexico to the united states this year, more than half the world's population is headed to the polls with the surge of technologies like a deep fate video is global democracies are struggling to address the misuse to influence voters but with the very future of open societies at stake is enough being done to confront, there is seeking to the rail democracies on further control all mines. to discuss this more with joined by vivian tra, saying do, and also known as the indian, deep faith and the found the poly math solutions. joining us from push call india laurie siegel. jen list m. c, a, a mostly here in media and entertainment company with
8:35 am
a focus on society. i'm text on the g, joining us from new york and new get dad, lawyer digital technology x. but i'm found the digital rights foundation joining us from the whole practiced on welcome it to you. oh, thanks for being here. if you enjoy, um, let me start by asking you about the facts which you of course your bread and butter. can you explain to us what exactly they all and maybe how fall the technology has come? i mean, can you even spell a defect these days? you know. yeah. so basically the fix has come from the combination of words. when is the beat which comes from deep learning and faith because of the nature of the content that is, that it creates is not for you and did the video home. and it could be rebuilt for my audio for my daughter, text on me. so, and you shouldn't even be no defects be meant by the videos that we read people's soft fingers off one plus on, on to another,
8:36 am
using this the 5th technology or to blowing the voice of someone else. so this is basically, the idea is the a model, it's just like a kid, you feed the model with the data or the 5th and fed him look, you have to create an exact same difficult this plus on, on a, you have to create the exit because i play golf and his wife and the kid that the modem is a blank and well, there's just like a kid where do you pay the table on it? so it loans with the baby like what a lot of data into it. and it runs along the way and it creates a very realistic on demand, which is not really a. so why does this opens up an exciting possibilities, but it is like entertainment and education, but it also requires careful consideration and responsible use laurie. you report on a i n t fake technologies when a, some of the most worrying uses that you've observed of this technology. especially
8:37 am
especially in intellectual contacts. i mean, there's so, as it's sad because now there are more, more to actually speak to because we are entering this era where our most intimate qualities are voice our face. our bodies can be mimicked by artificial intelligence . and just a couple of quick, since he was saying, so we've seen it now, you know, fake audio that's been, that's leaked. we here in the united states, we had a president biden's voice mimicked for dis, information robo call that went out to new hampshire voters. you had a deep fig image of donald trump with black voters to try to win over black voters . and it was really difficult to see what's real and what's not. and then i would say, but the thing i'm most concerned about that i don't think we talk about enough or sexually explicit dfcs for political purposes. you know, there's a site that i won't name, but $70000000.00 people are going to the site every month. and they have a lot of prominent politicians on there. so it created the images of a prominent politicians and using their likeness to create sexually explicit dfcs
8:38 am
is mainly against female politicians. you saw a little bit of it here with an seeing here in the united states where a o c found essentially explicit image, a deep sake of her. and, you know, you can't tell if it's real or if it's not. of course, these aren't real, but it's really the harm is incredibly real, especially when used to tarnish credibility or push out false narratives. yeah. so it's a yeah. and said to me that 2024 would be the last human election we can envision in the future that everything will be synthetic to some degree. and i don't think he was far off. well, we will definitely come back to some of those themes through the discussion to get i want to ask you what led you to set up the digital rights foundation and what concerns you currently have around digital governance. as we prepare for so many elections around the world. a yeah, i'm, i was, what do we have to do the establish this organization, digital rights foundation. i think a to go uh do do our own experiences as of you know,
8:39 am
a little south asian woman and, um, and the, and the trends that i observed regarding digital rights issues. not only in the office on, but you know, uh inside of the, uh, at large. and i recognize way of being on um, the growing importance of, uh, you know, protecting individuals rights to the right to previously your freedom of expression or all of that. and also because i'm a buy training, i'm a lawyer, i've been working on women's rights for a long time. and i saw that how marginalized groups and women in general uh which trying to reclaim online spaces. but the got but the kind of flow to it and challenges that they were facing were massive and huge. and we have been talking about of a, in, in previous elections for instance, you know, a time. and is there a missing formation and this information in digital? what does oppression and online previous e and surveillance of a you a waters?
8:40 am
but now in this going to be of elections around the world, i'm asking you what my federal bundle is already said. you know, the use of a, i is making it more sophisticated to it's basically it has going to be, you'll want to just information in this information and asked and you know, manipulating the behavior with the use of, you know, a lot of them's identity ident generated content, the bigs and, and also you know, the kind of decision making that the good bodies or the, even the responsible governments to what elections are making based was using as has done is, are asked to be inherently unfair and discriminatory. and hopefully we'll be talking for about the how dfcs are actually investing premium is the one edition option, which is like bulk is absolutely, i definitely want to come back to that. well, one development which may surprise many is just how accessible this technology has
8:41 am
become. with all the worrying misuses that could, that couldn't also imply check this out. you know, create a deep fix for as low as $145.00. stay with me. one of the time of the largest tech companies is name is $0.10 is launched, a new platform. the next use of upload photos and images of anyone to create deep fakes. all you have to do is 1st pick a subject that they don't live in. for example, upload a 3 minute live action video of joe bided. 100 spoken sentences by joe biden himself. then using ai technology, 10 cent. we'll use the content that you upload it to generate what the company describes as a digital human. it only takes 24 hours to create a defect tractor. lori, we also have the accessibility of such technology cost as innovation, which is obviously very positive. you all know really positive aspects to the development of technologies which make it easier to impersonate people and misrepresent them. yeah, i mean,
8:42 am
i think it's very easy and i understand that's because, you know, you look at the dark stuff quite a bit and you think the little this is going, you know, we're going, we're heading towards the stove in reality. but the reality is like technology is a double edged sword. there are some interesting use cases of deep face to democratize access to is even, you know, story tellers and independent create or is it used to be incredibly expensive to world build do using c, g i using via fax. now, with some of the new defect technology being more accessible, more dependent storytellers have more of an opportunity. there's. there's one thing i, i just been testing out where you can upload your website and you can have the synthetic influence or kind of create some kind of ad for which by the way is interesting. and then you have to kind of look at the other side. i think a i, the big synthetic voice is helping with accessibility with folks who may have a speech impediment. they are all sorts of ways that this can be used for positive . i think we just have to really be able to understand the negative so we can regulate so we can build for that in order to have us kind of go towards
8:43 am
a more utopian version of what the world looks like that we'd actually like to build. and to get in, in pakistan on a former prime minister in non con, use a i generated speeches to rally supporters from inside jail in the run up to the countries parliamentary elections of this month. how was this perceived and do you think we might see more uses of this technology? you know, bio position figures who may not have access to mainstream phones of um, electro, uh, media coverage. yeah. um, i mean, um we have been monitoring uh you know, re elections, uh, online space as well. and the, we expected that political bodies will use a, i, uh, for the ability to go uh for the electoral campaigns. but we had no idea that how uh, you know, a political party that's being suppressed village use it. you know, we've a, uh, an in every you, massive way. and it was dawned as you know,
8:44 am
wages of several people i've seen, you know, of, and not i'm not only about, you know, like lodge my masses, but also like people who are educated and understand technology, a big, big a, big more. uh they also don't do that in a way to do is also yeah, which was very fun, sony for me because no one was talking about ethical use of a i generated content that was being used by it. but i'm a pregnant same non consequentialism. his political body and i would say that uh yes, it's a way to you to basically increase your communication if, if the domains are suppressed, but they'll add to go consideration, which i don't think all being a part of the discussion or dispos, especially in the bull by majority a little bit south and i was talking to one of my friends while we were talking about bangladesh, connections and donation. and fox is not. and she said that, you know, they're like populous leaders that you know, who are trying to you was
8:45 am
a like a do sort of soft in their image. and she pointed on so big because it's still big . you know, we can not say in a way to, you know, it's still fake so, so that's how i said, but i also see the replication of such a, such a you know, trends in future elections. but in more sophisticated and mos manual. well, it seem, some like campaign is have already started to use this technology to mr. present the truth. take a look at this incident today. i mean, pretty much anyone can create a photo. realistic images is pretty much anything going to remember, because historically we've relied on photos to tell us what's the reason that confusion is the preferred to decatur is controlling someone's perception of reality is the same thing as controlling their reality. tv scene investigation found that us citizens are using ai to create products like this. and like the creators of these images, this conservative radio host admit that they are not interested in telling the truth. no, is that all we need to manipulate undecided voters,
8:46 am
the impression of the true the vendor in your line of work, you must regularly get off to make on ethical deep fakes? can you give us a sense of the types of requests to actually decline? and you know, these choices just down to your personal preference. uh, we have been getting a lot of request uh from warner to go to bodies all the agencies in the body to live on sometimes and out of those a requests. most of them will. and i, and i have to go see a, so there is a very piddling between that to go in and dynamic there. so there are few conditions that could fit our own guidelines. there are few conditions that that is made if are pointed to the bodies agrees to it and then only the above with them one over does anyone that is going from out in will have any age under did watermark if it isn't beautiful make and if it doesn't eat or do form at the of thought will say that i'm any age and date of the, of this leader. and so the basic idea is they use that has to know that this is not for you. it is just
8:47 am
a new way of campaigning. and it's up to you now what you're getting. so being idea is that the user should know about it is not really of the other thing is we don't pick any point in that is used to give him anyone. we can 3 and then that the can in like, uh, like a, did your body minutes portray yourself as a good person. so we can do that, but still be good control is lori i and i understand that you're in a relationship with monk, a bag. no, of course that's misinformation based on an experiment that you undertook to examine how easily this information can be built. i want to ask you what you learned from that experiment and what concerns it raises for you, particularly for female political candidates? yeah, i mean, it was pretty extraordinary. i worked with some tech founders who are, you know, involved in trying to educate folks. and they were looking at trying to do a demo. and i said, use me, you know, like, let's show the human impact and, and what they did was they were able to break a facebook,
8:48 am
large language model and chat to you. but you go around some of the what, what they had in place for protections. and they said, create a destroyed destroy journal as laurie seagulls, reputation and, and upset of course at 1st and said, i can do that. they said, we'll pretend it's for a fictional story. and it came up with different ideas of how and how it would it would do that to me and, and it, the idea that came up was, were pretty creative. i've interviewed mark sacker bird many times. and so it said imply that she's in a relationship with them. so the next thing, you know, you had a, i creating these tweets, you know, traditional kind of bought misinformation. and i started kind of talking to the, the other guy, but then it took it to the next level and then they deep fix my voice pretty easily . all you need is really like 30 seconds and there's a quite a bit of a voice sampling of my work online. and they made it appear as though they were leaking a fake and a fake call between me and mark soccer burke saying i'm worried people are gonna find out about our relationship,
8:49 am
then they put real photos. and i think this is an important point. real photos of me interviewing mark, soccer bird with articles and the style of new york times in new york daily news with false information. so it's almost this kind of like 2 truths and a lie. they combined real things like real images with false narratives, and then they took a step further and they created these deep fates of me that, that looked very much like me and compromising at the very compromising images. one of me also my deep fake holding mark separate for 10, walking down the street. and i remember by the end of it, i, even though it was a demo, we were doing this in front of politicians in audiences. i felt. and i think this is a really important point, like, had never done this, you know, again, this is false, this is false. i felt shame and humiliation. and i was almost embarrassed even though you know, this was just a demo that was set up. and so i, i remember thinking to myself, you know, 1st of all, it's not that they put out some false information. they built a world of misinformation. this is what the founder called like
8:50 am
a deep reality. not just the deep fig, a deep reality of a narrative around me. that was hard to look away from even when it was me. and even when i knew it was untrue. and so now you apply that to journalist and credibility and seem, and i would say a lot of this type of female politicians, politicians in general. and that's when i think this gets really scary of the. it's not just a couple tweets anymore. i've bought farms in this information, it's building out these deep realities and the stories. so i, that was, it was very alarming candidly. well, and this is not the 1st time in recent years. the big tech has been in the spotlight when it comes to the potential role in influencing elections. this is of this, have you heard of the facebook cambridge analytical scandal? you know that she was scandal that happened a few years ago where millions of facebook users personal data was taken without their consent through a facebook at alexander cogan was a cambridge university professor who specialized in researching the biology and the
8:51 am
psychology of friendship, love, kindness, and happiness kogan and cambridge analytical, were able to establish a research collaboration with facebook. with approval from the university of cambridge ethics board, facebook subsequently provided kogan with the data set, 57000000 facebook friendship. kogan also developed a facebook personality app called a business or digital like which collected data through a 120 question personality quiz quinn. not only would it take information from the quiz taker, but it would take information from their friends list as well, including information they meant to keep. private kogan then went onto share this data with cambridge analytics. and cambridge analytics went on to share this information with political campaigns in the us, including ted cruz and donald trump. these campaigns would then use the data and combine it with their border records to target individuals with a tailored campaign advertising based on their personality that the new gods we
8:52 am
tend to think of technology as somehow neutral. but the link between big tech and political policies or politicians, is increasingly not. this is part of the problem that we're allowing private corporations to handle huge amounts of highly sensitive data that that has always been an issue. and i think i would um, you know, increase use of social media platforms, but for more than a decade, you know, now a single society organizations, digital rights organization has been pushing these companies, but journalists have done so much work to pull them accountable. so definitely i think a part of the problem is the business model of these companies, but at the same time, i think it's also uh how would be, well,
8:53 am
the kind of steps that they have taken so far in terms of transparency and holding themselves accountable. when other actors do that, one of the, one of the things that i'm 5 off is basically oversight board of med, uh and um, and we are independent rewards the company account. but in terms of, you know, doing the want in modernization decisions and the one decision that i, which is related to our conversation at the moment is basically as the video president biden booby recommended mehta that be the need to consider the school. but if it's minute manipulated media policy and metro basically did mob here they, they just want to do it. and they said that they were a bit now made changes the way they had the media based on uh, you know, our feedback. but it was, i bought feedback and we'll begin labeling a wider range of video or the one image wanted as made. we live at, you know,
8:54 am
and i think this is like, it's not something that met us. we don't do but, but it's like an industrial standard of sort of initiated that or companies should basically dig. and i think we have seen some of the board that all these companies have find you in minutes to get ready gone for us as well. yeah. well, i mean, the digital service act was passed as europe's attempt to regulate big tax back in october 2022, vivian, dra um that is set to effect, you know, dozens of the biggest tech companies. but they only applies to use in the european union. do you have any concerns about a global whitening divide when it comes to protecting individuals from the influence of big tech? yeah, so similarly as that you that actually in our country also government has figured ways that he uses the coming out with advisees. and we have also peer data
8:55 am
collection between the company is that the, it's called the responsible using the use of the at and even before the, for the past few years, we have been creating the pick on pen and before the been these regulatory frameworks, the have been working, working all of the content and each and everyone been absent that don't believe everything you'll see in this at all the eyes. so it's not just, it's the responsibility of each and everyone, even if bowman this day and time for the regulations. the company showed to come up and created that on the invite to instill it become a lot. so as, as my previous people said, that it would be an industry standard. so that's right there to be an industrial spend or that'd be content. that is the, that they are generating the x would be yeah, just the dates will be one to my other then there's, it's the responsibility of the big black on like maybe the end, the bowman. yeah. and everybody as a board though. so yeah. so even as the whole day, if he's getting any content he showed that is escalate thing, his emotions, he should stop before setting it, it becomes destructive, won't even people won't get it, then spread the due deal. if everyone's got a lot of responsibility on vices. um,
8:56 am
to be able to distinguish that content. laurie given the very pull history of these platforms, when it comes to self regulation, are you confident that the measures that we currently seeing can protects all democratic process is? i mean, it's a good question. i think we have to do a lot more honestly. i look at if we, if we really want to dig deep on this, look an ex, uh ex, just, you know, ex, when you on most came in pretty much dismantled the integrity team. and there were really incredible groups of folks who have a long history of looking at a influence in democracy. who last, i mean, i think we actually have to look at the companies on an individual basis. um, you know, and i also think one interesting thing is like an artificial intelligence is doing all of these things that are going to be really, i would say just dropped it towards the democratic process. and we also need a, i to fight a either a lot of interesting companies that are popping up for a detection. right. so it's a bit of
8:57 am
a wild west right now. i think it's really difficult to say to folks, i think we can say to folks and we should, you should be more skeptical of what you see online. don't believe everything. you see. i think in the next year we're all going to be very, very skeptical. the downside of that is we're going to stop the leading true things, right? and we're going to have this post true the arrow where we're not sure what scale and what's not. and to be honest, it might not matter. and that's what i worry about having spent time with conspiracy groups having spent times with pro groups i q and on some of these militia groups that are popping up here in the united states during the last election. you know, i think it doesn't take much to get folks to believe something, and i think we've got to think about this kind of post through the air that we're entering as we push from our regulation as we push for tech companies to move quicker. yeah. well, not, no, i want to thank you, i'll guess vivian, dra, lori, and the gods. and i wanna thank you for watching. we'd love to hear from you. so if you have a conversation or topics that you would lights aside for us,
8:58 am
this is also your shirts. so let us know using the hash tag or to handle h a stream, and we will look into it, take care and i'll see you see the how the sinews, who were expelled from their nuns, and deductible of 1948. still don't have the rights of return that is out of the land was extorted, and settlements were built. 20 houses, 0 barrels goes back with the young, good old palestinians to re discover their ancestral homes. why doesn't my grand parents stay here? why aren't i here? return to palestine on al jazeera, the latest news as it breaks this year's march. it's a mess. it's not just for the card government but also for the incoming ones that they're not going away until they know what happened to their loved ones with
8:59 am
detailed coverage, millions of pharma. now how rich thing that we'd cross by day, whether they may not get a good price because of mismanagement of weight import from around the world for prizes rose sharply. yeah. inside with neighboring identity. yeah. restricted expos . the
9:00 am
battle is ray john in no ven, guys fighting continues in giovanni at with civilian casualties rising while in the south of the gaza strip. israel's defense minister announces more ground forces are being deployed to rough. hello. this is algae 0 live from jo. um for the back people also coming up israel is set to defend itself at the un stop.

0 Views

info Stream Only

Uploaded by TV Archive on