ずくなしの冷や水

January 10, 2020

So, who shot down the Ukrainian plane near Tehran?

Update, 2020/1/11: Iran has announced that it shot the plane down by mistake.

The view is gaining ground that the Ukrainian plane that crashed near Tehran was brought down by some external force. Canada's Trudeau said there are multiple intelligence reports indicating that Iran did it, and Iran has reacted furiously.

I think an accidental shoot-down by Iran is possible, but taking into account the way the US has been talking, and the fact that Trump seemed rattled, the possibility that the US did it has also grown somewhat stronger.

Syrian Girl seems to suspect the US.



※ Syrian Girl@Partisangirl
The US were informed via spy channels that Iran was about to retaliate.
So they told their assets to shoot down a civilian flight, perfectly film it, then send the film to NYT.
It's a message, they shot down an Iranian plane in 1988.
#IranPlaneCrash

※ Damian Vigorito@damian_vigorito
Who's filming a video before the missile hits it and turning over to the NY Times.
How american not iranian





※ Arabi Souri عربي سوري@3arabiSouri, tweet of 2020/1/10
Who usually stands int he middle of nowhere at the dense of the night filming the air, coincidentally where a missile will fire up and shoot a plane going down... Iran still says there's no missile, so could be something else.

※ Syrian Girl@Partisangirl, tweet of 2020/1/10
Indeed it was 4am
posted by ZUKUNASHI at 22:56 | Comment(2) | International / Politics

Terrifying or nothing to fear? Apple admits to scanning user photos, presumably only to hunt child abuse

RT, 2020/1/9
Terrifying or nothing to fear? Apple admits to scanning user photos, presumably only to hunt child abuse
Apple has confirmed that it scans user images in an effort to detect evidence of child abuse, but the company has revealed little about how the scans work, piquing concerns about data privacy and the reach of intrusive tech firms.

While it’s unclear when the image scans started, Apple’s chief privacy officer Jane Horvath confirmed at an event in Las Vegas this week that the company is now “utilizing some technologies to help screen for child sexual abuse material.”

Apple initially suggested it might inspect images for abuse material last year – and only this week added a disclaimer to its website acknowledging the practice – but Horvath’s remarks come as the first confirmation the company has gone ahead with the scans.

A number of tech giants, including Facebook, Twitter and Google, already employ an image-scanning tool known as PhotoDNA, which cross-checks photos with a database of known abuse images. It is unknown whether Apple’s scanning tool uses similar technology.
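To make the mechanism concrete: a cross-check of this kind is essentially a hash lookup against a database of known material. The sketch below is a minimal, hypothetical illustration in Python using plain SHA-256 digests; tools like PhotoDNA actually compute a robust perceptual hash so that resized or re-encoded copies still match, and neither their hash function nor their database is public, so every name and value here is an assumption.

```python
import hashlib
from pathlib import Path

# Hypothetical digests of already-known images (illustration only).
# Real screening systems use perceptual hashes supplied by clearinghouses,
# not exact SHA-256 digests, so altered copies can still be detected.
KNOWN_DIGESTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def flag_matches(photo_dir: Path) -> list:
    """Return the photos whose digests appear in the known-image set."""
    return [p for p in photo_dir.glob("*.jpg") if file_digest(p) in KNOWN_DIGESTS]

if __name__ == "__main__":
    for hit in flag_matches(Path("./photos")):
        print("match:", hit)
```

The point of this design is that only digests are compared, so the party holding the database never needs to see the photos themselves.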

The move set off alarms among critics, with some disputing Apple’s sincerity in its avowed desire to crack down on crime, and others questioning whether the photo scans will further erode the privacy of consumers, especially given the scant detail the company has so far offered about the screening process.

“Of course everyone is for stopping child abuse, and that’s not the issue here. It’s that I’m simply not buying it,” journalist and political commentator Chadwick Moore told RT. “I don't believe that Apple really cares about fighting crime.”

Moore said the company has been excessively vague about the scans, noting “all it says is that they can scan all your images, flip through all your data, and look for potentially illegal activity, including child pornography.”

What does that mean? That’s terrifying language. What else are they looking for? If you’re smoking a joint, is that next? I don’t trust these companies, I just think it’s ever-encroaching more and more into our privacy, into owning our data.

Tech expert and privacy advocate Bill Mew said the critics are wrong, however, arguing that the new measure may be less intrusive than it appears given Apple’s technological capabilities.

“The technology that is in use is really clever,” Mew told RT. “It doesn’t necessarily mean that Apple can actually see your photos,” as the company can “sift through these images and test them against a set of known ‘fingerprints’ ... without actually de-encrypting the images themselves.”

Therefore, there’s little to fear on the privacy front.
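A rough way to picture the “fingerprints” argument: if a device computes a compact perceptual fingerprint locally, the matching step only ever sees those bit strings, never the image. The snippet below is purely illustrative and assumes 64-bit fingerprints compared by Hamming distance; Apple has not published its method, so the threshold and database entries here are invented for the example.

```python
# Illustrative only: compare 64-bit perceptual fingerprints by Hamming distance.
# In a real system the fingerprint would come from a published perceptual hash
# (e.g. a DCT-based pHash) computed on the device, so the matching service
# never receives the image pixels themselves.

KNOWN_FINGERPRINTS = {0x9F3A5C7E10D2B844}  # hypothetical database entries

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit fingerprints."""
    return bin(a ^ b).count("1")

def is_flagged(fingerprint: int, threshold: int = 5) -> bool:
    """Flag a photo whose fingerprint is within `threshold` bits of a known one."""
    return any(hamming(fingerprint, known) <= threshold for known in KNOWN_FINGERPRINTS)

print(is_flagged(0x9F3A5C7E10D2B846))  # True: one bit away from the known entry
print(is_flagged(0x0123456789ABCDEF))  # False: not close to anything known
```

This is the design choice Mew is pointing at: the comparison operates on fingerprints rather than on decrypted photos, so a match reveals only that an image resembles a known entry, not its content.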

While Apple has gone to bat for data privacy in the past – on several occasions tussling with law enforcement agencies seeking access to one of the company’s devices – its track record on the question is somewhat mixed. In August, it was revealed that company contractors were granted access to customers’ private conversations through Apple’s AI assistant program, Siri, in an effort to “grade” its performance. Several other tech giants have come under fire for similar intrusions, with both Google and Amazon’s home assistant devices also found to surreptitiously record users.



posted by ZUKUNASHI at 20:47 | Comment(0) | Digital / Internet