Skip to main content

tv   The Stream Deepfakes in Politics - An AI Election Era  Al Jazeera  May 16, 2024 11:30pm-12:00am AST

11:30 pm
in elect or reform for people have died in the unrest including a police officer. just simmons has this report. reinforcements to strengthen the ranks of security forces. 17000 kilometers away in new caledonia, head out from france. i'm a mission to quail. the violets in new caledonia is capital new mayor looting an awesome continue despite a curfew and stage of emergency french troops. i've also been deployed to backup hundreds of extra police officers, the mineral rich pacific garden became a colony and the 19th century. it still remains the french control, the situation we have cut down here and the situation there remains very tense with looting, riots, fires and a soldiers that are obviously intolerable and unspeakable. i would remind you that 2 joan dams have died. this is on dime a shelf in the head yesterday. another shone dom died today as a result of
11:31 pm
a weapon mishandling. with the wrong starts ability of this week, politicians from the comic indigenous community accused fronts of trying to dilute the voting power. that's on the notes in that i don't think the french parliament is adult to the constitutional reform bill, which will allow, oh, french residents who have lived in new caledonia for more than 10 years to vote in elections. the kind of acts who make up about 40 percent of the population, say that being marginalized by the french, you call it donia, has one of the biggest nickel mining industries in the world of one and 5 residents live below the poverty line. of these images of damage from the writing were filmed on thursday by a business man who says, you have to pass through road blocks controlled by a gunman, excuse supposed to be almost 6. what is really happening is that as i move a lot to see, family members are keeping all my properties. i see everything on the boat. i see the stores looted by the people. i see the fives and the dogs warehouses offices
11:32 pm
and was, i admit, even though i carry a gun with me all the time to handle any situation. the reform bill needs to be rectified by congress before becoming low. all french forces clump down on violence on the orders of president emanuel mack chrome. he's also calling for dialogue with the opposition in new caledonia. right now. that seems unlikely. andrew simmons, which is 0. i'd say if it's all 5 good morning who's coming up here? announcer right off, so we go to the street now the of the
11:33 pm
. ringback the we've all seen defect videos of politicians, takeover off social media from from using gen, the slang that came, don't learn during apple policies with 2024 sets would be the biggest election year in history. many asking whether this could be the when a why not the people is responsible for the outcome of elections, miriam, false. why and this is the stream, the dear people of the you do that it is d donald and add back all one of the greatest most open platforms that has ever existed. a human
11:34 pm
between us bridge and whistle. oh, see the so to, from practice on to mexico to the united states this you move in half the world's population is headed to the polls with the surge of technologies like a deep fate. video is global democracies all struggling to address the misuse to influence voters. but with the very future of open societies at state is enough being done to confront, there is seeking to the rail democracies on further control all mines. to discuss this more with joined by vivian tra, seeing do, and also known as the indian deep face, and the found the poly mac solutions. joining us from push call india laurie siegel
11:35 pm
. jen list m. c, a, a mostly here in media and entertainment company with a focus on society. i'm text on the g, joining us from new york and new gods, dad lawyer, digital technology x. but i'm found the digital rights foundation joining us from the whole practiced on welcome it to you. oh, thanks for being here. did you enjoy? um, let me start by asking you about the fakes which you of course your bread and butter . can you explain to us what exactly they all and maybe how fall the technology has come? i mean, can you even spoke a dfcs these days? you know. yeah. so basically the fix has come from the combination of words. when is the beat which comes from deep learning and faith because of the nature of the content that is that it creates is not for you and did the video home and it could be in vehicle made audio for my daughter. excellent. so,
11:36 pm
and you shouldn't even be no defects be meant by the videos that be that people the soft face of one person on to another using this be picked technology or to blowing the voice of someone else. so this is basically the idea is that the model, it's just like a good your feed, the model with the data or the for years. and then look, you have to create an exact same that because this person is you have to create the exec level golf, his voice, and the kid that the modem is a blank. and well, there's just like a kid where do you predict will on it? so it loans with the baby like what it of what a lot of data into it. and it runs along the way and it creates a very realistic on demand, which is not really a. so why does this opens up the exciting possibilities, but it is like entertainment and education, but it also requires careful consideration and responsible use laurie. you report
11:37 pm
on a i n t fake technologies when a, some of the most worrying uses that you've observed this technology, especially especially in intellectual contacts. i mean, they're so, as it's sad because now they're more or more to actually speak to because we are entering this era where our most intimate qualities are voice our face. our bodies can be mimicked by artificial intelligence. and just a couple of quick says he was saying, so we've seen it now, you know, fake audio that's been, that's leaked. we here in united states, we had a president biden's voice mimicked for this information robo call that went out to new hampshire voters. you had a deep fig image of donald trump with black voters to try to win over black voters . and it was really difficult to see what's real and what's not. and then i would say, but the thing i'm most concerned about that i don't think we talk about enough or sexually explicit dfcs for political purposes. you know, there's a site that i won't name,
11:38 pm
but $70000000.00 people are going to the site every month and they have a lot of prominent politicians on there. so it created the images of prominent politicians and using their likeness to create sexually explicit dfcs is mainly against female politicians. you saw a little bit of it here with an seeing here in the united states where a o c found essentially explicit image, a deep sake of her. and, you know, you can't tell if it's real or if it's not. of course, these aren't real, but it's really the harm is incredibly real, especially when used to tarnish credibility or push out false narratives. yeah. so it's a yeah. and said to me that 2024 would be the last human election we can envision in the future that everything will be synthetic to some degree. and i don't think he was far off. well, we will definitely come back to some of those themes through the discussion to get i want to ask you what led you to set up the digital rights foundation and what concerns you currently have around digital governance. as we prepare for so many elections around the world or yeah, i'm, i was,
11:39 pm
what do we have to do is establish this organization digital rights foundation. i think a to go do do our own experiences as a, you know, a little south asian woman and, and the end, the trends that i observed regarding digital rights issues, not only in the office on, but you know, in south asia at large. and i recognize way of being on the growing importance of, uh, you know, protecting individuals rights to the right to previously. you are putting them on the expression, all, all of the and also because i'm a by training, i'm a lawyer. i've been working on women's rights for a long time, and i saw that how marginalized groups and women in general which trying to reclaim online spaces but the but, but the kind of flow to is and challenges that they were facing were massive and huge. and we have been talking about of a, in, in previous connections,
11:40 pm
for instance, you know, a time to do that. i'm missing information and this information in digital, what does oppression and online previously, and surveillance of a you avoided. but now in this going to be of elections around the world, i'm actually what my kind of bundle is under the said, you know, the use of a i is making it more sophisticated. so it's basically, it has going to be, you'll want to be simple mation and this information is actually now manipulating the behavior of the use of, you know, a lot of them's identity ident generated content. the big's and up. and also, you know, the kind of decision making that the good bodies or the, even the responsible governments to what elections are making based was using as has done is, are asked to be inherently unfair and discriminatory. and hopefully we'll be talking for about how these big thought activity investing premium is one edition
11:41 pm
option, which is like bulk is absolutely, i definitely want to come back to that. well, one development which may surprise many is just how accessible this technology has become. with all the worrying misuses that could, that couldn't also imply check this out. you know, create a deep fix for as low as $145.00. stay with me. one of the time of the largest tech companies is name is $0.10 is launched a new platform. the next use of upload photos and images of anyone to create deep fakes. all you have to do is 1st pick a subject that say joe biden, for example, upload a 3 minute live action video of joe bided. $100.00 spoken sentences by joe biden himself, then using ai technology, $0.10 to use the content that you upload it to generate what the company describes as a digital human. it only takes 24 hours to create a d put character. laurie, we also have the accessibility of such technology cost as innovation, which is obviously very positive. you all know really positive aspects to the
11:42 pm
development of technologies which make it easier to impersonate people and misrepresent them. yeah, i mean, i think it's very easy and i understand that's because, you know, you look at the dark stuff quite a bit and you think the little this is going, you know, we're going, we're heading towards the stove in reality. but the reality is like technology is a double edged sword. there are some interesting use cases of deep face to democratize access to is even, you know, story tellers and independent create or is it used to be incredibly expensive to world build do using c, g i using via fax. now, with some of the new defect technology being more accessible, more dependent story tellers have more of an opportunity there's. there's one thing i, i just been testing out where you can upload your website and you can have this synthetic influence or kind of create some kind of ad for which by the way is interesting. and then you have to kind of look at the other side, i think a i, the big synthetic voice is helping with accessibility with folks who may have a speech impediment. they're all sorts of ways that this can be used for positive.
11:43 pm
i think we just have to really be able to understand the negative so we can regulate so we can build for that in order to have us kind of go towards a more utopian version of what the world looks like that we'd actually like to build. and to get an impact on a former prime minister in non con, use a i generated speeches to rally supporters from inside jail in the run up to the countries parliamentary elections of this month. how was this perceived and do you think we might see more uses of this technology? you know, bio position figures who may not have access to mainstream phones of um, electro, uh, media coverage. yeah. um, i mean, um, um we have been monitoring uh you know, re elections, uh, online space as well. and the, we expected that political bodies with you was a, i, uh for the ability to go uh for the electoral campaigns. but we had no idea that
11:44 pm
how uh, you know, a political party that's being suppressed village use it. you know, we've a, uh, an in a really massive way. and it was dumbed as you know, wages of several people i've seen, you know, of, and not, i'm not only about, you know, like large amount masses, but also like people who are educated and understand technology, a big, big a, big more. uh, they also don't do that, you know, they do do is also, yeah, which was very fun, sony for me because no one was talking about ethical use of a i generated content that was being used by it, but i'm a pregnancy and non consequentialism. his political body and i would say that it just, it's a way to you to basically increase your communication if, if the means are suppressed, but they'll add to go consideration, which i don't think all being a part of the discussion or dispos, especially in the will but majority
11:45 pm
a little bit south and i was talking to one of my friends while we were talking about the connections in donation and boxes not. and she said that, you know, they're like populous leaders that you know, was trying to, you was a, like, a do sort of soft in their image. and she pointed on so big because it's still big . you know, we cannot say, you know, which of, you know, it's still fake so, so that's how i see it. but i also see that application of such a, such a, you know, trends in future elections. but in more sophisticated and boss manual, well, it seem, some like campaign is have already started to use this technology to mr. present the truth. take a look at this incident today. i mean, pretty much anyone can create a photo. realistic images is pretty much anything one to remember, because historically we've relied on photos to tell us, let's read it. the reason that confusion is the preferred to decatur isn't the control someone's perception of reality is the same thing as controlling their reality. tv scene investigation found that us citizens are using ai to create
11:46 pm
products like this. and like this creators of these images, this conservative radio host admit that they are not interested in telling the truth. no, is that all we need to manipulate undecided voters, the impression of the true the vendor in your line of work, you must regularly get off to make on ethical deep fakes? can you give us a sense of the types of requests to actually decline? and you know, these choices just down to your personal preference. uh, we have been getting a lot of request uh from warner to go to bodies all the agencies in the bunch of bands and out of those a request most of them will. and i and i have to go see a so there is a very peddling between that to get them done advocate there. so there are few conditions that could fit our own guidelines that a few conditions that, that is made. if the parties agree is to it, and then only be love with them. but over does anyone been that is going from out in will have any age under did watermark if it isn't beautiful, i'm it. and if it isn't the audio format,
11:47 pm
the of the say that i'm any age and date of that of this leader. and so the basic idea is that use that has been know that this is not for you. it is just a new way of campaigning. and it's up to you now what you're getting. so being idea is that the user should know about that. it's not really of the other thing is we don't pick any point in that is used to the menu. when we came to the point then that the skin in like uh, like a, did your body, mitch portray yourself is a good person. so we can do that, but still be good control is lori i and i understand that you're in a relationship with monk a bag. now of course that's misinformation based on an experiment that you on to took to examine how easily this information can be built. i want to ask you what you learned from that experiment and what concerns it raises for you, particularly for female political candidates? yeah, i mean, it was pretty extraordinary. i worked with the tech founders who are,
11:48 pm
you know, involved to try to educate folks. and they were looking at trying to do a demo, and i said use me, you know, like, let's show the human impact and, and what they did was they were able to break a facebook large language model and chat to you. but you go around some of the what, what they had in place for protections. and they said create a destroyed, destroy internal as laurie seagulls, reputation and, and upset of course at 1st and said, i can do that. they said, we'll pretend it's for a fictional story. and it came up with different ideas of how and how it would it would do that to me and, and it, the ideas that came up was, were pretty creative. i've interviewed mark secretary many times. and so it said imply that she's in a relationship with them. so the next thing, you know, you had a, i creating these tweets, you know, traditional kind of bought misinformation. and i started kind of talking to the other guy, but then it took it to the next level and then they deep fix my voice pretty easily . all you need is really like 30 seconds and there's a quite a bit of a voice sampling of my work online. and they made it appear as though they were
11:49 pm
leaking a fake and a fake call between me and mark soccer burke saying i'm worried people are going to find out about our relationship, then they put real photos. and i think this is an important point. real photos of me interviewing mark, soccer bird with articles and the style of new york times in new york daily news with false information. so it's almost this kind of like 2 truths and a lie. they combined real things like real images with false narratives, and then they took it a step further and they created these deep fakes of me that, that looked very much like me and compromising at the very compromising images. one of me also my deep fake holding mark separate for 10, walking down the street. and i remember by the end of it, i, even though it was a demo, we were doing this in front of politicians in audiences. i felt. and i think this is a really important point, like, had never done this, you know, again, this is false, this is false. i felt shame and humiliation. and i was almost embarrassed. even though you know, this was just a demo that was set up. and so i,
11:50 pm
i remember thinking to myself, you know, 1st of all, it's not that they put out some false information. they built a world of misinformation. this is what the founder called like a deep reality. not just the deep fig, a deep reality of a narrative around me. that was hard to look away from even when it was me. and even when i knew it was untrue. and so now you apply that to journalist and credibility and seem, and i would say a lot of this type of female politicians, politicians in general. and that's when i think this gets really scary of the. it's not just a couple tweets anymore. i've bought farms in this information, it's building out these deep realities and the stories. so i, that was, it was very alarming candidly. well, and this is not the 1st time and recent years. the big tech has been in the spotlight when it comes to the potential role in influencing elections. listening to this, have you heard of the facebook cambridge analytical scandal? you know that she was scandal that happened a few years ago where millions of facebook users personal data was taken without
11:51 pm
their consent through a facebook at alexander cogan was a cambridge university professor who specialized in researching the biology and the psychology of friendship, love, kindness, and happiness kogan and cambridge analytic a were able to establish a research collaboration with facebook, with approval from the university of cambridge ethics for facebook, subsequently provided kogan with the data set, 57000000 facebook friendship. kogan also developed a facebook personality app called a business or digital like which collected data through a 120 question personality quiz quinn. not only would it take information from the quiz taker, but it would take information from their friends list as well, including information they meant to keep. private kogan then went onto share this data with cambridge analytics. and cambridge analytics went on to share this information with political campaigns in the us, including ted cruz and donald trump. these campaigns would then use the data and
11:52 pm
combine it with their border records to target individuals with a tailored campaign advertising based on their personality that the new gods we tend to think of technology as somehow neutral. but the link between big tech and political policies or politicians is increasingly nebulous, is part of the problem that we're allowing private corporations to handle huge amounts of highly sensitive data that, that has always been an issue. and i think uh, with a, you know, increase use of social media platforms. but from what, then it'd be gauge, you know, now a single society organizations, digital rights organization. i have been pushing these companies of journalists have done so much work to pull them accountable. so definitely i think a lot of the problem is the business model of these companies,
11:53 pm
but at the same time, i think it's also uh how would be, what does that kind of steps that they have taken so far in terms of transparency and holding themselves accountable when other actors do that, one of the, one of the things that i'm 5 both is basically oversight board of med, uh and um, and we are independent. we what the company account do. but in terms of, you know, doing the wanted modernization decisions and the one decision that i, which is related to our conversation at the moment is basically as the video president biden booby recommended mehta that they really need to consider the school buckets, minute manipulated media policy and metro basically came up here today, they just want to do it. and they said that they were a bit now made changes the way they handled many people that did media based on uh,
11:54 pm
you know, our feedback, the was i bought the back and we'll begin labeling a wider range of video. or do you want image wanted as made with the i the, you know, and i think this is like, it's not something that met us. we don't need to but, but it's like an industrial standard of sort of initiated that or companies should basically dig. and i think we have seen some at board that one of these companies have find you in minutes to get ready gone for us as well. yeah, well i mean, the digital service act was passed as it europe's attempt to regulate big tech back in october 2022 vivian, dra um that is set to effect, you know, dozens of the biggest tech companies. but the only applies to use is in the european union. do you have any concerns about a global whitening divide when it comes to protecting individuals from the influence of big tech? yeah, so similarly as that you did it clean up and you're not going to also government is
11:55 pm
speaking are the ways that he uses the coming out with advisees. and we have also clear data collection between the company is that the, it's called the responsible use and that the use of the at, and even before the, for the past 2 years, we have been creating the pick on thing. and before the been these regulatory frameworks they've been worked on making all the content and each and everyone been productive some that don't believe everything you'll see in this at all. so it's not just, it's the responsibility of each and everyone, even if government this taking time for the regulations, the company showed the come up and created that on the invite to instill it become a lot. so as, as my previous people said that it would be an industry standard. so that's right there to be an industrial spend, or that'd be content that is the, that they are generating the x would be yeah. and the dates would be, what am i other than this? it's the responsibility of the big black on to make, made in the moment. yeah. and everybody has a vote. oh, so yeah. so even as a whole day, if he's getting any content he showed that is escalate the thing,
11:56 pm
is it motions he should stop before setting it, it becomes destructive, won't even people won't get it and spread the due deal. if everyone's got a lot of responsibility on vices, um to be able to distinguish that content. laurie given the very pull history of the stop forms when it comes to self regulation. are you confident that the measures that we currently seeing town protects all democratic process is? i mean, it's a good question. i think we have to do a lot more honestly i look at if we, if we really want to dig deep on this, look an x x just you know, x when elan must came and pretty much dismantled. the integrity team and there were really incredible groups of folks who have a long history of looking at influence and democracy. who last, i mean, i think we actually have to look at the companies on an individual basis. um, you know, and i also think one interesting thing is like an artificial intelligence is doing all of these things that are going to be really, i would say just relative towards the democratic process. and we also need a,
11:57 pm
i to fight a i, there, a lot of interesting companies that are popping up for a detection, right. so it's a bit of a wild west right now. i think it's really difficult to say to folks, i think we can say to folks and we should, you should be more skeptical of what you see online. don't believe everything. you see. i think in the next year we're all going to be very, very skeptical. the downside of that is we're going to stop the leading true things, right? and we're going to have this post through the arrow where we're not sure what's fail and what's not. and to be honest, it might not matter. and that's what i worry about having spend time with conspiracy groups having spent times with for groups i q or non, somebody's militia groups that are popping up here in the united states during the last election. you know, i think it doesn't take much to get folks to believe something, and i think we've got to think about this kind of post truth error that we're entering as we push from our regulation as we push for tech companies to move quicker. yeah. well, not, no, i wanna thank you, i'll guess vivian,
11:58 pm
dra laurie and the got. and i want to thank you for watching. we love to hear from you. so if you have a conversation or topics that you would like to aside for us, this is also your show. so let us know using the hash tag or to handle a stream, and we will look into it. take care, and i'll see you soon. the, the house coverage of africa is what i'm most proud of. every time i travel away, whether it's east or west africa, people stop me and tell me how much they appreciate coverage. and our focus is not just on their suffering, but also on a more realistic and inspiring story. people trust to tell them what's happening in their communities in at p a and i'm biased and as an applicant, i couldn't be more proud to be part of the business like this. this brought to you believe, i guess is i like slowly on one of your this makes model inflates. the
11:59 pm
business like just is free to you believe i guess is an ice fly on one of your just makes modern plates. the
12:00 am
. ready and the head of the clock, this is in use on life and death coming up for the next 60 minutes. south africa requests the top you in court to order a whole, israel's referee offensive immediately as part of its case accusing its relative genocide in terms of the key point today is that this whales declared a wiping cause or from the map is about to be realized. on the day of the hearing is really bombing kills 5.

0 Views

info Stream Only

Uploaded by TV Archive on