- UK: Plea for Scotland Yard unit to tackle trolls as police forces face rising tide of online hate crime
- Google calls for anti-Isis push and makes YouTube propaganda pledge
- On Web, white supremacists stir up a growing and angry audience
- Czech-language FB page of non-existent group falsifies photo of banner carried by Romani people
- EU ruling holds website responsible for offensive user comments
- Belgian privacy watchdog sues Facebook
- Group Says Only One Third of Antisemitic Material Removed From YouTube, Facebook
- A Canadian Muslim Group Has Launched a Hate Tracking Website
- We will see more antisemitism, violence and terrorism. We should act now! (interview)
- USA: Half of Democrats support a ban on hate speech
- Google Maps’ cyber-racism hack locates White House after ‘N***a house’ search (opinion)
- Russia threatens to ban Google, Twitter and Facebook over extremist content
- Canada: Police investigation into racist Facebook pages continues
- UK: Police's '24-hour window' to halt cyber-hate after terrorist event
- UK: Kids 30 times more likely to be victim of hate crimes over internet
- UK: Portsmouth City Council criticised over racism investigation into Facebook posts
- Germany: Dangers that lurk on the Internet
- Google Says You Can Do What It Can't: Beat Online Anti-Semitism
- Cyberhate, anti-Semitism discussed at Jerusalem forum
- Israel Jails Palestinian Who Applauded Militant Attacks on Facebook
- USA: Study finds link between N-word Google searches, black mortality
- USA: How YouTube and FB ruined this principal’s chances of denying her racism
- Austria: Child abuse images deface Nazi Mauthausen camp website
- USA: Mother speaks out about Cyber-bullies
- Sexism and Racism in Video Games
- Steam Greenlight highlights hate-crime game, 'Kill the F*ggot'
- UK: Students schooled in cyber safety
- White supremacists stole my identity to spew hatred on the Times of Israel
- Czech Rep: Report: Anti-Semitic incidents rose last year; online incidents up 20%
- UK: Ku Klux Klan racist jailed again for far right rants on Facebook
- Czech Rep: Brno university opens centre for cyber attack defence training
- USA: Ignoring hate speech won't silence it (opinion)
- Poll: Most Canadians ignore hateful, racist internet speech
- WJC Latin America launches website to combat Holocaust denial
- YouTube Creators Questioned About Racism (opinion)
- Italy: Facebook blocks Salvini over 'gypsies' slur
- EU spends millions to make next Facebook European
- Facebook tracking said to breach EU law
- UK: Troll who abused disability campaigner reported to police
- UK: Newport man fined for posting racist comments on Facebook
- Slovakia: Government taking note of online extremism
- EU commissioners at odds over geo-blocking
- Canada: Facebook page targeting Winnipeg aboriginal people pulled down
- Website blocking in France; other anti-terrorist legislation in OSCE countries may curb free expression
- Why Facebook is finally baring all over social media standards
25/6/2015- Scotland Yard is failing to tackle internet trolls and should become the centre for a new national specialist unit to stop other forces being “overwhelmed”, a report today warns. It says police in England and Wales lack the specialist skills needed to deal with online hate crime as reported offences soar. The report says that in London online hate crime is “not specifically budgeted or resourced” for, while police methods vary greatly across boroughs and are “insufficient”. The failure of all forces to tackle the issue is allowing trolls to “act online with impunity and has fostered a breeding ground for hate crime”. The report, due to be published today but seen in advance by the Standard, also calls for the Government to introduce a “stand-alone offence” for online attacks, making it easier to prosecute offenders. It points to figures showing that last year the Met received 1,207 crime reports in which Facebook was mentioned, a 21 per cent rise in two years, while the number involving Twitter increased by 19 per cent.
The report, which surveyed hundreds of alleged victims, suggests that only nine per cent of alleged online hate crimes nationwide are investigated. London Assembly member Andrew Boff, author of the report, said: “Victims are left feeling isolated by online hate attacks and often feel like there is nobody to turn to. “They feel police can’t be turned to because they are overwhelmed by the number of cases and are unable to provide the level of support somebody would expect. If the police can’t help, who do they go to? That’s why this unit is needed. We’re talking about pretty appalling hate crimes.” Although it is a problem for all forces in England and Wales, the report says that the unit of IT specialists should be housed within Scotland Yard and coordinate a nationwide response. It would act as a “point of liaison” with internet service and social media providers, with each force contributing funding.
The report, #ReportHate: Combating Online Hatred, says: “With online hate crime on the rise and draining police force resources, the proposed unit is necessary to alleviate the burden currently faced by police officers. Additionally, it would create a better service to victims of hate crime online.” It says existing legislation is “muddled” and “obsolete”, and calls for new laws to create specific offences relating to online hate crime. It adds that reported abuses are “just the tip of an ever-growing iceberg”, with only 16 per cent of alleged victims in London surveyed by researchers saying they reported it to police. A Met spokesman said: "The Metropolitan Police Service is committed to tackling hate crime in all its forms, and has long since recognised the impact of hate crime on communities and the hidden nature of this crime, which remains largely under-reported. "We take positive action to investigate all hate crime allegations, support victims and their families and bring perpetrators to justice. We are always seeking ways to further improve our response to hate crime and increase reporting, and are willing to consider alternative ways of enhancing our investigative response and victim support, working closely with our partners.
"The Met's recently formed hate crime senior partnership group focuses on creating and delivering an effective hate crime operational strategy for London. This is being developed in partnership with strategic and community partners, demonstrating our ongoing commitment to reducing the harm caused by hate crime and increase the confidence of victims. "In addition, the Met's online hate crime working group has been set up to respond specifically to online hate crime and explore ways to tackle the issue. As part of this work, the group will consider the publication of the London Assembly report and the suggestions made. "If anyone feels that they are the victim of hate crime, we would urge all victims to come forward and report any incident or crime as soon as possible. "All 32 London boroughs have a dedicated Community Safety Unit (CSU) with more than 500 specially trained officers across the Met who investigate hate crime and domestic abuse."
‘I believe they are out of their depth’
Suzanne Fernandes, a youth worker from west London, was sent racially offensive material on Twitter, including photographs, pornography and death threats. The troll, who cannot be named for legal reasons, also obtained a picture of her young son through Facebook and used it to create a fake account sending lurid tweets. She told the Standard: “I’m not the same person. I felt completely panic-stricken, and when they used my son’s picture it was the breaking point. I had to get counselling. “I decided to report the matter to the police and in the early days of the investigation was given the impression I was wasting the police’s time. They are out of their depth. I feel there should be a dedicated unit for victims.”
© The London Evening Standard
Executives vow video site will not be used as a platform for ‘brutally violent propaganda produced by terrorists’, but argue against blanket censorship
24/6/2015- Google has issued a call to arms against Isis, arguing that the terror group has engineered a “viral moment” on social networks with propaganda and beheading videos that needs to be challenged. Two of Google’s top executives – legal chief David Drummond and policy director Victoria Grand – used the Cannes Lions advertising festival to launch an attack, and appeal, against terrorist propaganda on Google-owned YouTube. “Isis is having a viral moment on social media and the countervailing viewpoints are nowhere near strong enough to oppose them,” said Grand. “Isis, in particular, has been putting up footage that is inhuman and atrocious. We are still seeing about two or three of these beheadings each week. They are heeding advice from a decade before from Osama bin Laden and they are taking it to another level using social media.”
Drummond, a lead figure in the internet company’s battles with regulators globally, told the thousands of media executives in attendance that the Isis movement has been strategically astute in using social media for propaganda and recruitment purposes. “The power of community is not lost on Isis and they are using it to great effect. Right now the voice of that community is a lot larger than ours, a lot louder, there’s more out there on the web. When I say ours, I mean all of us, all of us in the room today.” While an element of Google’s presentation was undeniably an anti-regulation plug about the virtues of Google’s global operations, the Silicon Valley firm made a case for an anti-Isis push. “The challenge for us is to strike this balance between allowing people to be educated about the dangers and the violence of this group,” said Grand. “But not allowing ourselves to become a distribution channel for this horrible, but very newsworthy, terrorist propaganda.”
Google’s talk covered a wide range of topics on which the company is forced to make censorship decisions, from Robin Thicke’s misogynistic Blurred Lines video and a sexually explicit trailer for Lars von Trier’s film Nymphomaniac, to “prank” videos of teenagers inhaling deodorant spray cans, “how-to” euthanasia videos and police shootings in the USA.
However, arguably the most striking debate, which included the audience voting with red and green paddles about whether they would ban or approve material, involved Google’s discussion on terrorist propaganda. Google explained its justification for allowing graphic videos of the death of Neda Agha-Soltan, caught by an amateur in the aftermath of Iran’s elections in 2009, and the decision not to block the film of terrorists killing a policeman in the Charlie Hebdo massacre.
“At about 4:30am on the day [the Charlie Hebdo attack] happened, we got a call from our French colleagues asking what to do,” said Drummond. “As with other moment-of-death footage, we had to consider the dignity of the victim as well as the video’s news and commentary value. We decided to leave it up, and leave it up globally [except in France, where it is legally banned]. “It’s important to recognise here that the filming was done by a bystander who recorded the event. It wasn’t filmed by the perpetrators, wasn’t intended to terrorise anyone. Even though it is shaky footage, it became a critical part of piecing this event together, helping us to understand an event that happened far out of the media spotlight.” Google said this kind of footage – Grand described some of the films from repressed regimes as “shining a light in places that can be pitch black” with media blackouts – was justified but that Isis’s use of YouTube was not.
The company criticised outlets, including Fox News, for deciding to run full footage of the death of a Jordanian fighter pilot, which Google blocked. “Like the others, the purpose of this Isis execution of a Jordanian fighter pilot is to showcase in full high definition the most brutal way to die,” said Grand. “But a handful of mainstream broadcast outlets, including many outlets in the Middle East, as well as Fox News, made the decision to show this even though they wouldn’t show the beheading [of James Foley].” Grand said it was a “tough call” to ban the news organisations’ reports of the death that used the full graphic footage. “Yes, it was technically news but we decided that for some types of content, including Isis staged executions, the frame or news context put around it just can’t transform the original,” said Grand. “It was brutally violent propaganda produced by terrorists and we just don’t want YouTube to be a distribution channel for it.”
However, despite raising the calculated strategy of the Isis digital onslaught, Google argued that straight blanket censorship was not the answer. “Most of us, we want to see less violence in the world,” said Drummond. “We want alternatives. For many, the answer seems to be censorship. Although we take down the worst content from our sites, at Google, given the proliferation of content online we don’t believe that censoring the existence of Isis on Google, YouTube or social media will really dampen their impact. We think there is a better way to combat the hateful rhetoric of Isis, by countering it with reason. Understanding it. Standing up to it. Enforced silence is not the answer. Drowning out the harmful ideology with better messages, with reasonable messages, is the better way.”
Google put forward a challenge to the advertising and marketing executives to help populate YouTube with more content that combats Isis propaganda. “We used to think of terrorists as people who are hiding out in caves. But now would-be terrorists are hanging out online. Technology is one of the greatest tools we have to reach at-risk youth all over the world and divert them from hate and radicalisation. We can only do that if we offer them alternatives. It is only on open and diverse sites like YouTube… that we can find these countervailing points of view.”
© The Guardian
On July 14, 2013, a white supremacist named Andrew Anglin, bewildered by black Americans' outrage over the shooting death of Trayvon Martin, began typing out thoughts on what he saw as a distorted world.
24/6/2015- "The whole George Zimmerman media psycho-drama has been completely insane from the beginning," Anglin wrote on the Daily Stormer, the neo-Nazi website he had started, after a jury acquitted Zimmerman in the shooting death of Martin the year before. Anglin called Martin, an unarmed, black 17-year-old, a "crazed, savage attacker" and warned of a conspiracy by "blacks and the Jewish media" to cast the justice system as biased against blacks. Born amid a backlash against the post-Trayvon Martin movement drawing attention to racial bias, the site has exploded to prominence among white supremacists as #BlackLivesMatter protests stretched coast to coast. According to the website traffic tracking site SimilarWeb, by the end of 2013 Daily Stormer had more visitors than the rival Vanguard News Network, which has been around since 2003.
The Southern Poverty Law Center, which monitors hate groups, said in a March report that during the previous six months Daily Stormer's Web traffic on some days even surpassed that of Stormfront.org, the oldest and largest hate site. Anglin, in an interview Tuesday with The Times, attributed his website's popularity to his approach, which avoids long, online essays in favor of short, catchy posts. "I wanted something punchy and funny and enjoyable to read," Anglin said. "My ideology is very simple. I believe white people deserve their own country.... There's not really anything that can happen that can affect my ideology because it's so simple and straightforward." The website also may have been one of the go-to places for Dylann Roof, the suspected shooter in last week's church massacre in Charleston, S.C. An analysis by the Southern Poverty Law Center showed comments on the site appear similar to passages from a manifesto on Roof's website. Asked about that analysis, Anglin said the Daily Stormer user account cited by the center — "AryanBlood1488" — had commented maybe 21 or 22 times, not enough to be considered much of a regular.
If there is one thing Anglin, 30, and the law center can agree on, it is that websites such as his offer highly clickable destinations for hate group advocates. On some days since the Charleston shooting, hundreds of thousands of people have been drawn to these sites, which are crammed with material packaged beneath headlines geared to their angry audiences. "Obama shamelessly uses atrocity to call for gun ban," a headline on the website read on June 18, the day Roof was arrested. Stormfront.org was created in January 1995. In the first quarter of this year, it had more than 1.9 million U.S. visitors, a drop from its peak of 3.5 million in the first quarter of last year, according to SimilarWeb. Daily Stormer has grown steadily, from 127,343 U.S. visitors in the third quarter of 2013, when it was launched, to 949,170 in the first quarter of 2015.
Roof, who has been charged with murder in the deaths of nine black worshipers at Charleston's Emanuel African Methodist Episcopal Church, is believed to have posted a manifesto indicating he was inspired by material on the website of the Council of Conservative Citizens, a group formed in 1985 and whose website was created in 1996. Jared Taylor, a spokesman, said the group could not be blamed for Roof's rampage. "The impact on Roof obviously was terrible and unfortunate, and we completely, unequivocally condemn any kind of violence and illegality," Taylor said in a telephone interview. "But does that mean the council's website has some sort of responsibility for its actions? The answer is unequivocally no. We put forward information. What he did with this information is his responsibility."
Therein lies the danger of such sites, said Heidi Beirich, who heads the Southern Poverty Law Center's Intelligence Project. Most make a point of condemning overt acts of violence, even as they post reams of material aimed at fueling white rage and paranoia, she said. "They're smart enough not to make open calls for violence," she said. "It's all 1st Amendment-protected speech." But, Beirich added, "people are reading this stuff, they're sucking it in, and they're getting enraged, and we're having lone-wolf violence." On Tuesday, Anglin too disavowed responsibility for the Charleston shooting and condemned violence in general. "This is a news site. We report the news," Anglin wrote on the Daily Stormer. "We have an angle, just as everyone has an angle, but we are no more responsible for the actions of our readers than the Daily Beast is responsible for the actions of their readers."
FBI officials said they routinely monitor the websites to determine whether any are calling for a particular act of violence, which could lead to a criminal charge. The arrest and conviction of self-proclaimed white supremacist William A. White, they said, is a case in point. He was sentenced in federal court in Chicago to 42 months in prison in February 2013 for "soliciting violence" against the jury foreman in a case involving another white supremacist, Matthew Hale, who was convicted of soliciting the murder of a federal judge. White had used his site, Overthrow.com, to solicit "anyone" to kill the foreman, and he posted the foreman's home address and telephone number. The motto of Overthrow.com, which was affiliated with the National Socialist Workers Party, was to "fight for white working people." Since then, FBI officials said, most hate websites have been careful not to directly suggest violence.
On Friday, the day Roof was arraigned, Anglin had this to say:
"I don't support what Roof did, in any way, but there is now no going back from it," he wrote on his site. "We are in the middle of a race war. The random murders of Whites are going to begin any minute now, across the country. The media will try to cover it up, but there will be too many murders to hide."
© The Los Angeles Times
23/6/2015- The fake Czech-language Facebook group "Roma against Islam" (Romové proti Islámu) has published an altered version of a photo taken of a Romani demonstration in September 2013 in front of the Office of the Czech Government. On one of the banners, instead of the Romani flag which was actually being held, the photo has been doctored to show a text that attacks refugees. If the administrators of the page do not remove the photo, representatives of the Romani Democratic Party (RDS) plan to file criminal charges. The altered photo was posted on Sunday, 21 June and immediately became very popular, being shared more than 2 600 times and receiving more than 1 500 "likes" as of noon yesterday.
Instead of the Romani flag and the inscription "Romani Democratic Party of the Czech Republic" (Romská demokratická strana ČR), those who doctored the photo inserted the following text between the hands of the two Romani people in the photograph: "STOP REFUGEE RECEPTION. Bohemia belongs to us and our white brothers. Smokes back to Africa! Your educated Roma." Representatives of the RDS have already objected to the falsified collage. "If the administrators do not remove those photographs, we will file criminal charges. This discredits the RDS and is an abuse of our political party," Miroslav Rusenko, political secretary of the RDS Central Committee, told news server Romea.cz.
In January, Romea.cz was the first to report on the existence of this false Facebook group. The authors of its material demagogically attack Islam per se and have falsely claimed to be supported by the Dživipen association and the Terne čhave music group. Both Romani initiatives have distanced themselves from the page. In February 2015, photographs were posted to the page of the non-existent group's alleged administrator and secretary, a certain "Ján Balko". It took just a couple of minutes for news server Romea.cz to search online and determine that the photograph was actually of a Nobel Prize winner for chemistry, Venkatraman Ramakrishnan. Many photos of him exist online.
The fraudsters did their best to make it as difficult as possible to identify the man in the photograph, for example, by reproducing it as a mirror image so it could not be found using Google's instruments for photo searches, but Romea.cz discovered its origins nonetheless. It is evident from the materials posted to the Facebook page of this non-existent group that its purpose has been to attract Romani people to demonstrations by the anti-Islamic group "We Don't Want Islam in the Czech Republic" (Islám v ČR nechceme) that were held at the beginning of the year. Those same people are behind another false Facebook page called Educated Roma (Vzdělaní Romové), which insults Romani people with would-be jokes. Facebook has so far ignored requests that these false groups be removed.
The European Court of Human Rights (ECHR) has ruled that an Estonian court was right to fine a news website for anonymous comments posted under one of its stories
16/6/2015- MEPs and anti-censorship groups say the judgement sets a "dangerous precedent" which could pave the way for similar cases. In the ruling, the Strasbourg-based ECHR upheld a court decision against the respected Estonian news organisation, Delfi, which had run an online article about a ferry company making controversial changes to its routes. The changes attracted widespread criticism from bloggers, who posted 185 comments on the Delfi website, 20 of which contained personal threats and offensive language toward the ferry company’s majority shareholder. He took offence at the comments and, although the website agreed to remove them immediately, decided to sue the site. In April 2006 the shareholder was awarded 5,000 Estonian kroons in damages.
Delfi, one of Estonia’s most popular news sites, appealed against the decision but, in a ruling, the ECHR has upheld the original decision by the Estonian courts. The unanimous ruling from the seven ECHR judges suggested that if a commercial site allowed anonymous comments, “it is both practical and reasonable for the site owner to be held responsible for them.” It said, “The applicant company [Delfi], by publishing the article in question, could have realized that it might cause negative reactions against the shipping company.”
Delfi is believed to be considering an appeal to the Luxembourg-based European Court of Justice but UK Independence Party leader Nigel Farage said the ruling set a “dangerous precedent” for freedom of expression and may dissuade websites from hosting anonymous comments. He said, "This ECHR judgement makes life incredibly difficult for the growing number of local news websites and blogs which provide a varied and valuable public space.
“These websites do not have the financial or human resources to fight malicious threats to their existence from either politicians or litigants. It is especially harsh as in this case the website owner had taken down the offensive comments as soon as possible. “The hard lesson to learn is that while the UK is a member of the EU, we must be subservient to the European Court of Human Rights in Strasbourg. I believe it is the British Supreme Court which should have the final say, not the ECHR.” Anti-censorship groups fear that news sites and blogs could now be held legally responsible for all the comments put up on their site even if they take them down after a complaint. Jim Killock, Executive Director of the UK-based organisation Open Rights Group, described it as a “troubling ruling”, adding, "We all rely on defined 'notice and takedown' procedures. If courts don't respect the need for notice before takedown, then websites are going to find themselves in deep trouble."
His concerns are shared by Joe McNamee, Executive Director for European Digital Rights, who called the judgement “reckless”, saying, "This baffling logic now appears to render it effectively impossible for an online publication to allow comments without positive identification of the end users. “Worse still, we know already that many publications already protect themselves by requiring people to log in to almost always American social networks to identify themselves. So much for the human right to privacy in the Convention. This will directly undermine individuals' rights to free speech and indirectly undermine their right to privacy." However, a spokesman for the ECHR stressed that the ruling was only in relation to the particulars of Estonian law and was not applicable to other cases, except by way of case law. “All this tells us is about Estonian law,” said the spokesman. “It is not applicable to other countries.”
© The Telegraph
15/6/2015- Belgium's national privacy watchdog is taking US internet company Facebook to court, arguing that the way the social network website tracks the behaviour of both members and non-members is illegal under Belgian and European law. “Facebook's behaviour is unacceptable”, Willem Debeuckelaere, president of Belgium's Commission for the protection of privacy, said. It is the first time a national privacy watchdog in Europe has sued Facebook for not complying with privacy law. The basis for the case is research requested by the privacy commission and published in March, which noted that Facebook tracks user behaviour on non-Facebook websites by default until they opt out, instead of after seeking permission.
“As emphasised by the [European data protection body] Article 29 Working Party, an opt-out mechanism “is not an adequate mechanism to obtain average users’ informed consent”, particularly with regard to behavioural advertising. This means that Facebook’s current opt-out approach does not satisfy the requirements for legally valid consent”, the researchers concluded. It also noted that Facebook tracks the behaviour of people who are not members of Facebook, which also violates the EU's e-Privacy directive. “Even people who explicitly state that they do not want to be tracked, are tracked anyway”, Debeuckelaere told Belgian newspaper De Morgen, which broke the story on Monday (15 June).
Last month, the Belgian privacy commission presented its findings and recommendations to Facebook, whose European office is registered in Ireland. “They answered that they do not accept Belgian law or the authority of the Belgian privacy commission, and that it is all a misunderstanding”, said Debeuckelaere. A Facebook spokesperson told this website in an e-mail the company is “confident that there is no merit” in the case by the privacy watchdog, known in Belgium by its acronym CBPL. “We were surprised and disappointed that, after the CBPL had already agreed to meet with us on the 19th June to discuss their recommendations, they took the theatrical action of bringing Facebook Belgium to court on the day beforehand.”
A court in Brussels will hear the case on Thursday (18 June). Willem Debeuckelaere told this website in an e-mail that the date for the hearing, one day before another CBPL-Facebook meeting, is a coincidence. It is not the first time Facebook has come under fire over privacy in Europe. Austrian Facebook user Max Schrems has taken his case against Facebook all the way to the EU's Court of Justice. He announced last week that an opinion by advocate general Yves Bot, which was scheduled for 24 June, has been delayed. EU ministers, for their part, will on Monday discuss setting up a European privacy watchdog.
© The EUobserver
13/6/2015- Social networking giants Facebook and YouTube removed only one-third of antisemitic and anti-Israel uploads that were posted this past year, the watchdog group Israeli Students Combating Antisemitism (ISCA) reported. ISCA intends to present the data it has collected on antisemitism on social media to the fifth annual conference of the International Forum to Combat Antisemitism, which is being held over Tuesday, Wednesday and Thursday in Jerusalem. According to the data collected by the ISCA, 15,965 complaints sent to YouTube, Twitter and Instagram led to the closure of only approximately 5,000 pages with antisemitic content. Many of the pages that remained active featured Holocaust denial – prohibited by law in some European countries – in addition to well-known antisemitic tropes about Jewish money controlling the world or that Jews “control Washington DC,” among other offensive stereotypes, Israeli news portal NRG reported on Tuesday. In many of the cases, antisemitism on these pages masqueraded as pro-Palestinian activism.
Ido Daniel, the ISCA’s director, said, “The main problem is that Facebook until this day does not know what antisemitism is.” He added that, “it and other social networks find it difficult to identify antisemitism when we raise the problem with them. Sometimes many days go by before the complaint is received, if at all.” He continued, “my staff and I have trouble finding pro-Palestinian or anti-Israel pages that do not contain antisemitism. They themselves cannot separate legitimate criticism of Israel from conspiracy theories of the worst kind.” Gideon Behar, the director of the Israeli Foreign Ministry’s Department for Combating Antisemitism said he agrees with Daniel that the social networks are not doing enough regarding this issue. He said that, “internet companies and site operators do not understand that virtual antisemitism can become real antisemitism.”
He noted that these major internet companies only sent junior representatives to the International Forum to Combat Antisemitism. He called on the tech giants to “be part of the solution to combat antisemitism and to protect their users from this phenomenon.” Behar added that, “we look forward to participation from more senior staff. Even in everyday life, we do not see them taking any real action. They must internalize that hatred on the networks translates into actions against Jews in real life. Antisemitism is not virtual.” He asserted that the flooding of social networks with hatred of Jews and the growing antisemitism in Western Europe are the central issues facing Israel and the Jewish people. “Nine Jews were killed in three separate attacks this past year in Western Europe. This is unprecedented,” Behar said, accusing the EU and Western European countries of taking insufficient steps to stop the wave of hatred against their Jewish citizens. “Over the past two years, the two centers where we have identified a significant growth in antisemitism are in Western Europe and online social networks.”
© The Algemeiner
A Canadian Muslim advocacy group has launched a website to track Islamophobic vandalism, hijab pulling and similar forms of bigotry, as recent statistics show an uptick in hate crimes against Muslims between 2012 and 2013.
9/6/2015- Released Tuesday by Statistics Canada, the data revealed a 17 percent overall drop in hate crimes reported to police over the one-year period. But concerns remain for a number of groups who are still disproportionately targeted, including Muslims, blacks, and Jews, and for sexual minorities, who are more likely to face violence. The total number of reported hate crimes in Canada was 1,167 in 2013, down from 1,414 in 2012. About half of those were motivated by racial prejudice, while 28 percent stemmed from bias against a religion, followed by 16 percent based on sexual orientation. Overall, both religiously motivated and racial crimes were down. At 181 crimes, Jews still made up more than half of victims targeted for their religion, while black people made up the bulk of the race category with 255 incidents. Muslims, however, saw an increase from 45 incidents in 2012 to 65 in 2013.
Canada's hate crime rate is far lower than some other countries. UK statistics show over 44,000 hate crimes for England and Wales alone. Given the population differences, that's about six and a half times the Canadian rate, though Statistics Canada only includes cases "substantiated by police" while the UK data cover those "recorded by police." A US survey published by the Bureau of Justice Statistics found 293,000 cases, though the number proven by police is far lower. These differences highlight an obvious problem with the Canadian data. Statistics Canada notes that hate crimes are notoriously underreported, and estimates that just over one third of victims come forward, based on a past study.
The National Council of Canadian Muslims is looking to close that gap by giving victims a platform for reporting incidents online, even when they're afraid to go to police. When confronted with hate, users are asked to submit an account with as much detail as possible, including the time, date, photos, and a description of what happened, all of which is plotted on an interactive map. Amira Elghawaby, the group's human rights coordinator, told VICE News that their data shows a further doubling of anti-Muslim incidents between 2013 and 2014, suggesting the trend is continuing. She said that vandalism is the most common problem, but that assault and harassment are frequent enough, especially for women wearing headscarves.
"The majority of individual assaults target women who are visibly Muslim," she said. "Women who are wearing the hijab are, by far, most frequently the victims of hate crimes." Elghawaby says she saw spikes in hate crimes immediately after the Charlie Hebdo killings and the Parliament Hill shooting. Stigmatization and a sense of futility often discourage people from reporting hate to police, she said, telling the story of two Muslim girls who had their hijabs pulled off by a substitute teacher during class, but were reticent to speak up and report the incident to police. "They didn't want to go public or press charges because they didn't want to bring negative attention to themselves, or to the school," she said. "It was difficult for the two girls. They're young, and they didn't want to testify in a courtroom."
For this reason, the site will allow users to remain anonymous in their posts, though the group will keep their name on file so they can verify the incident. Elghawaby said that the posts will not reveal the identity of perpetrators, however, for fear of endangering people with unproven allegations, although they may link to media reports. "Our aim in launching a national hate crime awareness project is to urge Canadian Muslims, as well as fellow Canadians, to report hate wherever and whenever it happens so that we can find ways to combat it," she said. Apart from anti-Muslim hate, the Statistics Canada data weren't all rosy, of course. Hate crimes based on sexual orientation stayed pretty stable between 2012 and 2013, and were particularly likely to include acts of violence. While about half of the crimes directed against other groups were acts of "mischief," like graffiti or property damage, two thirds of crimes against sexual minorities were violent in nature, with threats and assault remaining quite frequent.
Though the anti-hate website will not specifically address homophobic or transphobic incidents, a spokesperson for an LGBTQ rights organization joined Elghawaby at a press conference to support the initiative. "Unless we address the root of the problem — the hateful idea that one group of people can be set above the rest; that one set of characteristics is 'normal' and therefore superior to all others—we will never be successful in addressing any one of its symptoms," said Ryan Dyck of Egale Canada. The statistics showed that Thunder Bay and Hamilton had the highest rates of reported hate crime across the country, with both surpassing five times the national average. Urban areas were the site of almost three quarters of reported hate crimes, with half in Toronto, Montreal and Vancouver alone, although the National Council of Canadian Muslims noted that victims might be even less likely to go to police in rural areas.
© Vice News
Mr. Ronald Eissens is the General Director and Co-Founder of the Dutch NGO Magenta Foundation, which focuses on international human rights and anti-racism. He also founded the International Network Against Cyber Hate, which fights discrimination and other forms of cyber hate. We discussed how to fight hate speech and online antisemitism, the consequences of the rise of antisemitism in Europe, and the situation of Turkish Jews.
By Karel Valansi
10/6/2015- The Global Forum for Combating Anti-Semitism ended recently in Israel. Do you consider it successful? Is there an action plan?
In my view it was more successful than the previous ones, since it was very much geared towards action. For now there is a summary of recommendations. Some of these are quite important like these three:
# Adopting a formal, legal definition of anti-Semitism. This definition will include attacks against the legitimacy of the State of Israel and the denial of the Holocaust.
# Strengthening legislation against anti-Semitism and the training of police in better enforcing existing laws.
# Education ministries in Europe must promote education to religious tolerance and preserving the memory of the Holocaust.
How do you describe hate speech? Words are powerful and have consequences. They can be used as a tool of propaganda but at the end they can create a new mindset…
Hate speech is any speech that sets out to dehumanize, discriminate, defame, vilify or insult a group of people on the basis of gender, ethnicity, religion, sexual orientation, skin color, or any speech that incites to violence and murder against said groups. Of course speech like this in the end polarizes society, poisons it even, so that an atmosphere is created in which negative action can be taken against certain groups, e.g. Jews.
Social media distributes hate easily, just by a like, a fav or a retweet. What can be done to fight online anti-Semitism?
A lot. Removal of antisemitic content. When necessary, legal action. In countries where there is no hate speech law, lobbying should be started to introduce one. Counter-speech by directly engaging those who engage in hate speech. Education: training children in anti-bias, training them in media literacy (how to handle the internet, what hate speech is, how to distinguish between valid and false information, how to debunk conspiracy theories and Holocaust denial, and where to find tools and information for this).
What do you mean by counter-speech?
This is a pilot project which our Dutch Bureau has been doing. Counter-speech means engaging with online discrimination, racism and prejudice through various methods, including countering 'bad speech' with 'good speech', arguing on blogs, discussion groups, web forums and social media, dispelling myths with hard facts and information, debunking stereotypes and assertions, and running positive campaigns.
How does the International Network Against Cyber Hate fight cyber hate?
INACH, The International Network Against Cyber Hate, fights cyber hate by exchanging information, by doing joint actions against hate sites and other expressions online, by working with the social media and by lots of other things. Have a look here, you will get an impression.
Is there a contradiction with freedom of expression?
No, not really. Freedom comes with responsibility to protect all citizens. In a few countries, like the Netherlands, the freedom of speech is curbed by anti-hate speech legislation. The rationale for this is quite simple. History has shown that if we let hate speech run rampant, eventually this will lead to a take-over by the haters. In other words, democracy would be abolished and a dictatorship would be ruling. What is the first thing dictators do? The abolishment of free speech. So, hate speech legislation is there for a very good reason: the protection of democracy and the prevention of (ultimately) genocide.
A record number of French Jews made Aliyah. There were already clear signs of anti-Semitism in Europe. On the other hand, Islamophobia is also rising, and we see that far-right parties are gaining popularity. Where is Europe heading?
As it looks now, Europe is heading for major problems with regard to both Right-wing extremism and Muslim-extremism. The first is directed at Muslims and Jews, the second at Jews and ‘the West’ and basically anybody non-Muslim. For the longest time, Europe has neglected to take hard action against the extreme right and has failed to see in time that antisemitic sentiments run high among Muslim migrants and that parts of the European Muslim communities have become breeding grounds for violence and terrorism directed at Jews and the countries themselves. Since the Muslim population of Europe is by now quite large, politicians have also started pandering to the Muslim vote and are avoiding measures that could aggravate these voters. We will see more antisemitism and violence and terrorism. We will see more Aliyah. This can only be stopped if European politicians realize that they should act now.
Anti-Semitism has always been an issue in Europe. Can we say that there is a renaissance of anti-Semitism, or that it has taken a new form, in the shape of anti-Israelism and Holocaust denial?
‘Old wine in new skins’, as my panel at the Global Forum was called, is I think very apt. Anti-Zionism and anti-Israelism are just another obfuscation of anti-Semitism, in effect a very convenient way to hide that people are in fact antisemitic.
You have been to Turkey before. What do you think about the situation in Turkey? Approximately 17,000 Jews live in Turkey, where 69% of the general population was found to hold antisemitic views according to the ADL Global 100 Index.
I think Jews in Turkey are, like Jews in too many other countries, very much under threat. How long people stay is a matter of how much communities and individuals feel under threat. I do not think Turkey is worse than, for example, France or the Netherlands. The difference is that in Turkey antisemitic outbursts are more visible, also by politicians, while in the Netherlands, France and other European countries anti-Semitism is mainly ‘under water’ but is increasingly coming above water, also in the mainstream, these days mainly coming out of the Muslim communities and the left wing.
© Salom Turkey
Most Americans support expanded federal hate crime laws, but are divided on banning hate speech.
20/5/2015- Since 1994, people convicted of federal crimes motivated by the 'actual or perceived' identity of victims have faced tougher sentences. Many states passed 'hate crime' statutes in earlier years, and in recent years many states have adopted laws making crimes motivated by the victim's sexual orientation or gender identity hate crimes subject to tougher sentences, something the federal government did in 2009. Unlike much of the rest of the developed world, however, the United States does not make it a criminal offense for people to make statements which encourage hatred of particular groups. For example, a prominent British columnist, Katie Hopkins, is being investigated by the police for referring to African migrants crossing the Mediterranean as 'cockroaches'.
YouGov's latest research shows that many Americans support making it a criminal offense to make public statements which would stir up hatred against particular groups of people. Americans narrowly support (41%) rather than oppose (37%) criminalizing hate speech, but this conceals a partisan divide. Most Democrats (51%) support criminalizing hate speech, with only 26% opposed. Independents (41% to 35%) and Republicans (47% to 37%) tend to oppose making it illegal to stir up hatred against particular groups.
Support for banning hate speech is also particularly strong among racial minorities. 62% of black Americans, and 50% of Hispanics support criminalizing comments which would stir up hatred. White Americans oppose a ban on hate speech 43% to 36%.
When it comes to crimes motivated by hatred, most Americans do back the current federal hate crime laws, including the expanded definition of hate crime passed in 2009. 56% of Americans back the federal law mandating tougher penalties for crimes motivated by race, religion or gender, and 51% support expanding that to include sexual orientation, gender identity and disability. Democrats (68%) tend to be much more supportive of the law than either independents or Republicans. Republicans (38% to 39%) are split over the expanded definition of hate crime, while independents tend to support (46%) rather than oppose (28%) it.
Cyber racism is nothing new, and the purveyors of online hate are at it again. If you don’t believe it, just go to Google Maps on your smartphone or computer and type in the words “ni**a house Washington” or “ni**er king” or even “ni**er house.” And what do you get? Google takes you to 1600 Pennsylvania Avenue. That’s right, the White House, where the black folks live.
by David A. Love
20/5/2015- To its credit, Google apologized for the mishap, without explaining why this is all happening in the first place. “Some inappropriate results are surfacing in Google Maps that should not be, and we apologize for any offense this may have caused. Our teams are working to fix this issue quickly,” said a Google spokesperson, as reported in The Guardian. This is not the first time Google has found itself in such a predicament. The company has had issues in the past and has to be more diligent. Back in 2010, Google’s auto-complete search engine suggested racist queries after the user typed in “why.” Moreover, its ad system was 25 percent more likely to display ads for criminal background checks after a user typed in black-sounding names.
Contrary to the belief that racism is going away, in many ways, it has become more recalcitrant. And with online technology, that racism has become more sophisticated, with cyber racists digging in their heels and lashing out anonymously. Just ask black writers what they face on a daily basis with online racist threats and harassment on social media, as Juan Thompson perfectly lays out in The Intercept. All of this puts in perspective the comments that first lady Michelle Obama made recently at Tuskegee University on the indignities black people, including the Obamas, face on a daily basis. Everyone knows the stories of the racist email messages sent by Ferguson city employees, or emails that have circulated among Republican politicians depicting the Obamas as monkeys, President Obama as Ronald Reagan’s pet chimp and the White House lawn as a watermelon patch.
Even the Obama daughters are not immune to E-lynching. When a photo surfaced of Malia wearing a Pro Era t-shirt, for the Brooklyn-based hip hop collective, the right wing took to social media with racist attacks on her, calling her the N-word and accusing her and her father of being anti-cop and anti-white. And when a GOP aide from the Tea Party lashed out at the Obama daughters Sasha and Malia by calling them classless dressers, #blacktwitter, which wasn’t having it, decided to call her out on her racism and shut her down. Meanwhile, I just typed in “Ni**er University” on Google Maps, and Howard University popped up. Still not fixed. If we didn’t live in that post-racial America some folks claim we’re now in — now that we have a black president, who apparently is called the N-word often — it would be enough to make you really angry. Google, you need to tighten up those algorithms, or whatever you do in situations like this.
© The Grio
Russia’s communications watchdog has threatened to fine Facebook, Google and Twitter and block their services under a controversial law on blogging.
20/5/2015- In a letter to executives on Monday, the director of the communications oversight agency warned that the three US companies could face sanctions if they continued alleged illegal activities in Russia, Izvestia newspaper reported on Wednesday. Any action could affect a number of social media sites: besides its eponymous social network, Facebook also owns the photo-sharing service Instagram, while Google owns YouTube, BlogSpot and Google+. Facebook and Twitter, in particular, have been instrumental to organisers of opposition protests in Russia, where the major television news channels are controlled by the state. A spokesman told the state news agency RIA Novosti that the watchdog’s complaints related mainly to deleting pages with extremist materials and receiving information under what is known as the “bloggers law”. This 2014 legislation requires popular bloggers to register their real identities with the authorities, a measure that prominent bloggers say is designed to curb free speech and criticism of the regime.
The agency’s deputy director, Maksim Ksenzov, had issued a warning to the three companies on 6 May, telling them they were in violation of the bloggers law because they had not provided requested data on the number of daily visitors to several users’ pages, as well as information allowing the authorities to identify the owners of accounts with more than 3,000 daily visitors. According to the law, the agency can fine a violating organisation up to 300,000 roubles (£3,850); a second infringement can incur a fine of up to 500,000 roubles or a suspension of its operations for up to 30 days. If the companies did not take steps to delete from their sites “information containing calls to participate in mass rioting, extremist activities” or unsanctioned public events, the watchdog would “limit access to the information resource where that information is posted”, Ksenzov warned.
Although Russia’s four major internet providers are reportedly able to block the URL of specific pages on social networks and other websites, regional providers with less exact technologies have in the past been forced to block entire services due to one offending page. According to the communications oversight agency’s data, 70% of registered internet providers are able to block separate URLs, Izvestia reported. Since the start of President Vladimir Putin’s third term in 2012, the government has launched a crackdown on the internet in Russia, passing laws that give state supervisory bodies wide-ranging powers to regulate and block websites. Several news sites critical of the Kremlin have been blocked in Russia, including Grani.ru, EJ.ru and Kasparov.ru, which was founded by self-exiled chess grandmaster and activist Garry Kasparov. Facebook blocked 55 pages at the communication oversight agency’s request in the second half of 2014, including one inviting people to a rally in support of opposition leader Alexei Navalny. Google was forced to move some servers to Russia this year under a law requiring Russians’ personal data to be stored on its territory.
© The Guardian
A police investigation launched nearly two months ago into racist social media pages continues.
20/5/2015- Thunder Bay Police Service spokesman Chris Adams on Tuesday confirmed officers are still looking into a series of Facebook pages that have popped up, which police previously labelled as “extreme racism” and said appeared to target the city’s Aboriginal community. “That investigation is still ongoing. Anything that involves an Internet-based organization such as Facebook is going to take time,” he said. “It’s still on the radar for us.” While he did not want to speculate on the outcome, possible charges could include harassment or even hate crime.
It’s not a cut-and-dried case.
“In general, these are tough investigations because of the layers and the fact you’re dealing with organizations that are very much international in scope, and you have to weigh that against the right of free speech and the ability to be photographed when you’re out and about,” Adams said. “There are a lot of issues that are tied to it and the laws that are in place don’t always address those. Let’s face it, technology outstrips everything we’re used to right now and we’re still coming to grips on how to keep up with it.”
© TB News Watch
Intervention by police within the first 24 hours of a terrorist event could be key to halting the spread of cyber-hate, a new study has found.
22/5/2015- Cardiff University researchers found online hate in the aftermath of the murder of Lee Rigby peaked in the first 24 hours then declined sharply. They found tweets from police and media were about five times more likely to be retweeted compared with tweets from other users following the attack. The fusilier was killed on 22 May 2013. The research is being published in the British Journal of Criminology on the second anniversary of his murder near Woolwich Barracks in London. Michael Adebolajo and Michael Adebowale drove into the 25-year-old before hacking him to death. During the study, social and computer scientists at the university focused on the production and spread of racial and religious cyberhate and the Twitter battle between police and far-right political groups in the first 36 hours following the attack.
© BBC News
Manchester's school kids are more than thirty times more likely to become the victims of hate crimes over social media than in person.
17/5/2015- A Freedom of Information request submitted to Greater Manchester Police produced some harrowing statistics on reported hate crimes relating to social media throughout the region’s schools. It is a well-documented development that sites and apps like Facebook and Snapchat are the new battleground for young people, but in a society of increased inclusion and tolerance, these figures indicate a worrying trend. The findings showed that across the last 12 months, 45 direct hate crimes had been reported across Greater Manchester’s primary and secondary schools. In contrast, a staggering 1,391 cases were reported which related to social media, including Facebook, Twitter, Snapchat, eBay, Instagram and even Tinder.
Andrew Bolland, from Stop Hate UK, told MM: “Our service has seen a growth in the use of social media as a means to direct hostility and abuse. “The underlying issue is that perpetrators believe that they are invisible and would be impossible to trace, not realising that it is regarded as seriously as direct abuse from a criminal justice perspective.” A hate crime is defined as any criminal offence which is perceived by the victim or any other person to be motivated by hostility or prejudice based on characteristics or perceived characteristics defined within five monitored groups, also known as ‘hate motivations’. These five groups are disability, race, religion, sexual orientation and transgender identity, but GMP also records crimes motivated by someone's perceived or actual alternative subculture identity. One of the concerns that naturally accompany hate crimes via social media as opposed to direct cases is the potential ripple effect of prejudicial messages to an almost infinite network.
Mr Bolland explained: “A big problem is the ‘mushroom of media’. If someone has a hundred friends, the abusive message spreads through a vast network of people. “It only takes a small number of people to see that message without a proper understanding of the issue, and suddenly it increases and escalates.” He suggested that those convicted of such crimes could be brought together with victims, so as to create a mutual understanding and to make perpetrators aware of the impact their behaviour has on people’s lives. But first and foremost, tackling the issue starts with preventative action and ironing out misguided prejudices through education. He added: “Action needs to be taken to educate younger people early on the issue of discrimination, particularly with regards to social media, and prevent these situations arising in the first place. “We need to try to reach them through schools, youth groups and so on.”
© Mancunian Matters
A council has come under fire for its handling of an investigation into racist messages which appeared on a taxi boss’ Facebook page.
15/5/2015- Viv Young, who has a share in Portsmouth cab firm City Wide Taxis, has been allowed to hold on to his hackney licence despite Portsmouth City Council being made aware of a stream of abusive messages which surfaced on his account.
I have no grievance with anyone. I have sponsored a little girl in Africa for years.
Taxi trade official and City Wide Taxis shareholder Viv Young
The News reported the shocking online posts to the authority, which included: ‘I was driving me cab today and picked up a tribe of, shall we say (not typically English coloured people). ‘I was wondering by having them in my cab, am I leaving myself open to catch malaria, cholera, dengi fever, and of course tics??????’ Another post, of a man wearing a burkha while holding a can of lager, was uploaded – with each getting likes from Facebook users – while another message called Muslims ‘nonces’. The council’s licensing team looked over the evidence presented to it, before a committee at a meeting decided no action would be taken against Mr Young, saying the Facebook messages were private and that no complaints had been made by members of the public.
At the hearing, Mr Young insisted his Facebook profile had been set to private – which The News knows not to be true as the messages were viewed by reporters before Mr Young deleted his account. Questions have now been raised as to why the council did not look further into whether the posts were public – and why it needed further complaints to take action. Jabeer Butt, deputy chief executive of The Race Equality Foundation said: ‘I don’t know a great deal about how the taxi trade is regulated, but part of those regulations states the safety of customers is paramount. ‘It seems odd Portsmouth City Council has managed to carry out an investigation where someone has repeatedly posted such statements about these customers, and concluded no action should be taken. ‘Clearly, there are some terrible things said on the internet and it’s become a haven for people to make very abusive comments.’
Lib Dem Councillor Gerald Vernon-Jackson said: ‘This is not acceptable. ‘It’s pushing the onus on to the victim to put in a complaint. ‘It’s not right people can be putting out racial abuse – then doing this is saying “it’s all right”.’ Mr Young, who received the support of other taxi drivers at the meeting, many of whom were from ethnic minority backgrounds, said some of the posts on his Facebook wall appeared as they had been shared by other users and had not directly been written by him. And Mr Young defended himself by making references to Nazi leader Adolf Hitler’s book Mein Kampf. He said: ‘Have you read Mein Kampf? If you don’t want to read it then don’t – it’s not compulsory. My Facebook (page) was personal – if you don’t like it then don’t look or delete me.’
In his statement at the meeting, overseen by Cllrs Hannah Hockaday, Lee Mason and Sandra Stockdale, Mr Young said: ‘This is a News witchhunt. ‘Some parts of the postings may have been taken out of context. ‘There were no poppy burnings or pig beheadings. I have no grievance with anyone. I have sponsored a little girl in Africa for years. ‘Portsmouth City Council councillors and officers come under the political umbrella and speak as such – I use industrial language. ‘I’m surprised and disappointed by reactions, this should never have got this far. ‘There have been no complaints from the public. I am one of the best drivers in Portsmouth. In my opinion there’s no case to answer.’
Cllr Hockaday, committee chairwoman, said: ‘The comments made were private. Mr Young has friends from a diverse range of backgrounds. ‘We will renew his licence but will make it clear that if there’s any recurrence of this then we will revoke it in the future.’ She also said the comments were ‘disappointing’ and could be seen as disrespectful by members of the wider community. ‘You don’t have to wait for someone to complain to take action on this,’ she added. Cllr Hockaday was unavailable for comment following the hearing to explain the decision further.
© The Portsmouth News
Suicide pacts, dates to hunt down homosexuals or to torture homeless people: For children and teenagers, danger lurks not in the streets, but on the Internet, a German child welfare organization says.
13/5/2015- The Internet plays a huge role in most children's lives - also here in Germany. Young people these days are increasingly online with their smart phones, and often enough, they end up on websites that weren't necessarily designed for them. The German youth protection website "jugendschutz.net" has documented what youngsters are likely to come across on the Internet. The organization, which is linked to the Commission for Youth Media Protection (KJM), monitors the Internet for content harmful to minors. "Jugendschutz.net" gives parents and teachers guidance; for instance, in a newly updated booklet entitled "A net for children – Surfing without risk", commissioned by the Family Ministry in Berlin (BMFSFJ). The organization's 2014 annual report takes a close look at mobile communication, and points out the most problematic issues:
1. Risk of self-inflicted harm. Always on the lookout for cool things to do, even elementary schoolchildren get together to collectively swallow a mix of baking-powder and vinegar - trivialized as a dare, it's a lethal, explosive cocktail in the children's stomachs. Glamorizing anorexia is another fad, even to the point of virtual hunger Olympics: Who weighs even less?
2. Announcing suicide. In 13 cases, life-threatening situations forced "jugendschutz.net" to call the police. Sometimes, people seek partners for a suicide pact via the Internet; the young people also discuss procedure: whether it's better to jump from a tall building or in front of a train.
3. Sexual posturing. Photos and videos of teenagers in sexualized poses are particularly frequent. Such content is almost always generated through foreign servers, mainly Dutch, US and Russian. Often these pages can be deleted, but usually only after five days.
4. Sexualized violence via download. In 2014 alone, "jugendschutz.net" took action 1,168 times against cases of portrayals of sexual abuse in the net - significantly more than the previous year.
5. The lure of jihad. Political extremism also has a toehold on the Internet, and has long since discovered children as a target audience. Islamist videos do a brisk business: they use music and quick cuts, which comes across as professional and caters to what adolescents are used to seeing. In "Flames of War", the jihad is portrayed as an adventure for teenagers - this film and others are widely distributed via YouTube and Facebook.
6. Hunting down "the other". Videos that show violent attacks on gay or homeless people, or drug addicts, can easily get more than 100,000 clicks. Some show systematic torture. Clips by the neo-Nazi Okkupay-Pedofilyay movement, founded in Russia, are particularly notorious. The videos are circulated at a furious pace on Facebook, YouTube and via the Russian VK network.
7. Rightwing Muslim-bashing. Social media are a key platform for the far right, where they ridicule mainly Muslims, and where they liken them to athlete's foot and trash. The bashing follows a pattern: the more provocative the insult, the more clicks it gets. It's a snowball effect.
© The Deutsche Welle.
At Jerusalem Global Forum for Combating Anti-Semitism, Google policy chief explains how users are key in the fight against online hate.
13/5/2015- Internet giant Google is making strides to combat online anti-Semitism, but executives insist the most effective way to counter hate online is by activists creating an effective counter-narrative. Speaking in Jerusalem at the 5th Global Forum for Combating Anti-Semitism, Juniper Downs, Senior Policy Counsel for Google US, said her company has teams working "24 hours a day" to locate and remove hate content on platforms such as YouTube. Offending items - including anti-Semitic propaganda - are regularly removed, and users responsible for "particularly egregious" posts are often banned altogether, she said. The sophisticated systems Google employs include a "turbo-charge flagging" system, which allows developers to identify "trusted flaggers" - users with a good track record of flagging up inappropriate material.
That system is a crucial tool in ensuring the system can't be abused, as users will often simply flag items they just don't like; for example, Downs noted that the most-flagged YouTube video is a fairly innocuous music video by pop singer Justin Bieber, whose off-stage antics have made him a deeply unpopular figure, despite the video itself being totally inoffensive. Context is also key, she added. In the past, some media watchdogs and anti-hate groups - including the Middle East Media Research Institute (MEMRI), which monitors Arab and other Middle Eastern media - have found themselves subjected to temporary bans, after users flagged "hate speech" in videos which were in fact meant for the purpose of exposing incitement.
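The "trusted flagger" idea described above - weighting a report by the flagger's track record, so that mass-flagging of merely unpopular videos carries little weight - can be illustrated with a toy model. Everything below (the class name, the threshold, the moving-average update) is invented for illustration only; it is not Google's actual system or API:

```python
# Hypothetical sketch of a reputation-weighted flag queue, loosely
# modelled on the "trusted flagger" idea reported above. All names
# and numbers are invented; this is not Google's implementation.
from collections import defaultdict

class FlagQueue:
    def __init__(self, review_threshold=3.0):
        self.review_threshold = review_threshold
        self.accuracy = defaultdict(lambda: 0.5)  # flagger id -> historical hit rate
        self.scores = defaultdict(float)          # video id -> weighted flag score

    def record_outcome(self, flagger, was_valid):
        # Exponential moving average of how often a flagger's reports
        # are upheld by human review.
        old = self.accuracy[flagger]
        self.accuracy[flagger] = 0.9 * old + 0.1 * (1.0 if was_valid else 0.0)

    def flag(self, flagger, video):
        # A flagger with a strong track record counts for more, so a few
        # trusted reports escalate a video while drive-by flags from
        # low-accuracy accounts (the "Bieber problem") barely move it.
        self.scores[video] += self.accuracy[flagger]
        return self.scores[video] >= self.review_threshold

q = FlagQueue()
for _ in range(10):                      # build up a trusted flagger:
    q.record_outcome("watchdog_org", True)  # ten upheld reports
print(q.accuracy["watchdog_org"] > q.accuracy["random_user"])  # True
```

Under this sketch, a video flagged repeatedly by low-reputation accounts never reaches review, which is one way to blunt the abuse Downs describes.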
Downs said Google now ensures footage containing hate speech or other forms of anti-Semitism or racism isn't removed if it is included for "documentation or condemnation" of bigotry - an important distinction to avoid collateral damage against groups fighting to expose online hate. But while Google is working hard to remove online incitement, Downs also emphasized that ultimately, simply removing the videos was just one part of the fight against anti-Semitism and hate speech. "We know that on our platform there are pieces of content which just go too far, and we take them down," she said. But more sophisticated anti-Semites have found ways of getting round the hate speech rules, by treading the grey line between racism and offensive - but permissible - material.
Moreover, removing hate speech deals only with the tip of the iceberg; it doesn't address or combat the poisonous narratives which fuel the hate. "We’re troubled by the hate speech we find on our platform, but we're even more troubled by the fact that it represents sentiments which still exist today," she said. To tackle the problem at its root, users have to get more involved in actively countering the narratives of hate, Downs insisted. "Counterspeech is the most effective strategy in doing the real hearts and minds work," she stated. Paraphrasing Prime Minister Binyamin Netanyahu, who spoke at the opening evening of the Forum on Tuesday night, Downs compared what she terms "counterspeech" to lighting candles in the dark. "It's not a matter of lighting a single candle - we need millions of candles" to dispel the darkness of racism, anti-Semitism and other forms of bigotry online.
Educational videos providing facts and dispelling the arguments of anti-Semites are one tactic often used effectively, but Downs noted that some of the most popular anti-hate videos were those using satire. "Satire is a common and effective way of countering hateful views," she said. To help mobilize users, Google has been conducting offline events, bringing successful YouTube video producers to coach activists on how to be most effective in getting their message across. It hopes those events could ultimately empower users to do what the multinational technology behemoth can't - neutralize the narratives of hate at their source and shut down the demand for such material. The Global Forum for Combating Antisemitism is held every other year, under the auspices of the Foreign Ministry and the Ministry for Diaspora Affairs. It allows experts on anti-Semitism from around the world to meet and share ideas over three days.
© Arutz Sheva
The biennial Global Forum for Combating Anti-Semitism issued statements recommending steps for governments and websites to reduce cyber hate, and for European governments to reduce anti-Semitism.
14/5/2015- “Given the pervasive, expansive and transnational nature of the internet and the viral nature of hate materials, counter-speech alone is not a sufficient response to cyber hate. The right to free expression does not require or obligate the internet industry to disseminate hate materials. They too are moral actors, free to pursue internet commerce in line with ethics, social responsibility, and a mutually agreed code of conduct,” read a statement issued Thursday night in Jerusalem by the Forum. Among the recommendations to Internet providers: to adopt a clear industry standard for defining hate speech and anti-Semitism; adopt global terms of service prohibiting the posting of such materials; provide an effective complaint process and maintain a timely and professional response capacity; and ban Holocaust denial sites from the Web as a form of egregious hate speech.
Recommendations to governments include: establishing a national legal unit responsible for combating cyber hate; making stronger use of existing laws to prosecute cyber hate and online anti-Semitism, and enhancing the legal basis for prosecution where such laws are absent; and adopting stronger laws and penalties for the prohibition of Internet materials promoting terrorism and supporting recruitment to terrorist groups. The forum also addressed the upsurge of anti-Semitism in Europe. “European institutions and governments need to take strong proactive steps to address the current outbreak of anti-Semitism in order to assure the continued vibrancy of Jewish communal life in Europe,” read a statement issued Thursday.
Among the recommendations for combating anti-Semitism: adopting a formal definition of anti-Semitism applicable throughout the European Union and its member states under law, including reference to attacks on the legitimacy of the State of Israel and its right to exist, and to Holocaust denial, as forms of anti-Semitism; applying agreed standardized mechanisms for monitoring and recording incidents of anti-Semitism in all E.U. countries; taking urgent and sustained steps to assure the physical security of Jewish communities, their members and institutions; and directing education ministries to increase teacher training and adopt pedagogic curricula against anti-Semitism, and towards religious tolerance and Holocaust remembrance.
The three-day conference hosted a panel of prominent Muslim leaders and imams from Europe who came to speak out about anti-Semitism in Europe. The opening of the conference featured addresses by the mayor of Paris and the German justice minister.
© JTA News.
12/5/2015- An Israeli court on Tuesday sentenced a Palestinian for incitement and for supporting a terrorist organization based on Facebook posts that applauded militant attacks, his lawyer said. It was a rare case in which statements on social media were regarded as a crime. The defendant, Omar Shalabi, 45, a father of six from East Jerusalem, was sentenced to nine months in jail for 10 posts to his 5,000 friends and 755 followers that urged them to undertake “violent acts and acts of terrorism,” said the Hebrew-language indictment. Legal rights groups said it was unusual for an Israeli court to accept speech on social media as a basis for conviction. But they said that in recent months the Israeli police had detained several Palestinians from East Jerusalem and Arab citizens of Israel for incitement over comments made on their social media networks.
Mr. Shalabi’s posts included a photograph of a Palestinian man who was killed after he plowed his car into a group of pedestrians in Jerusalem, killing a baby. Four days after the attack, he wrote, “Hundreds of Jerusalem’s men are rising from their graves, and from under the hands of deprivation to cheer the soul of the martyr,” according to the indictment. Another post praised two cousins who had stormed into a Jerusalem synagogue in November, killing five men. Mr. Shalabi posted the photographs of the attackers and wrote: “Ask death to grant you life; glory is bestowed upon the martyrs.” “These posts motivated other Facebook users who shared the inciting contents with their friends and followers, who in turn supported the posts by pressing the “like” button,” the indictment said. “The mere use of this media, as the defendant has done, serves as a severe act, given the extensive circulation of the messages, as well as the ease with which these messages spread.”
Mr. Shalabi’s lawyer, Tariq Bargouth, said the conviction and sentence were handed down without it ever being established that Mr. Shalabi’s posts had encouraged any specific militant attack. There have been a series of so-called lone-wolf attacks in Jerusalem, in which Palestinian men, without any political backing or leadership, attack Israeli civilians or security officers. Avner Pinchuk, a lawyer with the Association for Civil Rights in Israel, which follows freedom of speech cases, said it was the first time he had heard that "incitement to terror in social media concluded in jail." Majd Kayyal, the media coordinator for Adalah, an organization that pursues the legal rights of Palestinians in Israel, accused security services of a double standard, saying they had not cracked down on Israeli Jews for incitement to violence online. He said his organization had tracked officials from the police and ambulance services who had encouraged violence against Palestinians on their Facebook pages, without punishment.
Mr. Kayyal said he also feared government officials were using the word “incitement” too loosely, saying they had to “prove a relation between what was written, and an incident that happened in reality.”
© The New York Times
11/5/2015- There is a connection between Google searches for the N-word and regional black mortality rates, according to a study by a university professor published in PLOS One last month. “We found that areas with a greater proportion of searches containing the N-word had not only a higher black mortality rate but also a greater gap in the black-white mortality rate,” David Chae, the study’s lead researcher and an epidemiology professor, wrote in an email. Researchers analyzed mortality rates from leading causes of death among blacks, including heart disease, cancer, stroke and diabetes, and adjusted for other relevant factors such as age, sex, the percent of the black population, levels of education and poverty. They also examined the gap between black and white mortality rates in each of the 196 areas assembled by the National Center for Health Statistics.
According to the study, the researchers found that each one standard deviation increase in area racism corresponded to about an 8.2 percent increase in the black mortality rate. Measuring racial attitudes can be tricky, Chae wrote. Past methods have included surveys, but he wrote that those can yield subjective results, as people are more likely to self-censor their more “socially unacceptable beliefs.” “Racism is a public health issue,” Chae wrote. “This study adds to evidence that racism is a social toxin that increases susceptibility to disease and generates racial disparities in mortality. It also points to the utility of using Internet-search-based measures to monitor racism at the area-level and assess its impact on health outcomes.”
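The headline figure - an 8.2 percent rise in black mortality per one standard deviation of area racism - is a standardized regression coefficient. A minimal sketch of that kind of calculation follows, on invented numbers rather than the study's data, and without the study's adjustments for age, sex, education and poverty:

```python
# Illustration of a standardized ("per one SD") effect estimate.
# The data below are made up for demonstration; they are NOT from
# the PLOS One study, and this omits the study's covariate adjustments.
import statistics

# Hypothetical area-level data: proxy racism score and black mortality rate.
racism    = [0.8, 1.2, 0.5, 1.9, 1.1, 0.7, 1.5, 2.0]
mortality = [900, 960, 880, 1040, 950, 890, 990, 1060]

def standardized_slope(x, y):
    # Ordinary least-squares slope, rescaled so it reads as
    # "change in y per one standard deviation of x".
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    slope = cov / statistics.variance(x)
    return slope * statistics.stdev(x)

per_sd = standardized_slope(racism, mortality)
pct = 100 * per_sd / statistics.fmean(mortality)
print(round(pct, 1))  # percent change in mortality per 1 SD of area racism
```

With real data the study also compared areas after adjusting for demographic and economic covariates, which a single bivariate slope like this cannot do.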
Rashawn Ray, a sociology professor at this university who teaches a class about modern perceptions of race, said Chae’s study provides a link between the words people use and prejudicial attitudes and behaviors. “This study shows that words actually matter, and words have detrimental effects on the life outcomes of blacks,” Ray said. “Because people who use racist words seem to also be likely to hold prejudiced attitudes and exhibit discriminatory behavior.” Though this state doesn’t show the same high level of racism as other areas of the country, Chae’s study showed that racism still exists here too, said Stephen Thomas, a health services administration professor and director of this state’s Center for Health Equity.
The findings serve as evidence of racial discrimination across America, Thomas said. Racism persists in housing, employment and the criminal justice system, Chae wrote. But some forms of racism are more subtle — Internet searches might reflect the hidden instances of racism, Chae said. “In a democracy like ours, a country made up of people seeking freedom from around the world, you should not be able to predict my life expectancy by my zip code,” Thomas said. “You should not be able to predict my quality of life based on a geographic map of people Googling the word ‘n-----.’”
Thomas said he hopes Chae’s work helps people understand how prevalent racism is in the U.S. “The consequences of that word impacts people’s life and longevity,” Thomas said. “What Dr. Chae’s work shows to someone looking at that map is there are more Baltimore uprisings to come, more Fergusons to come unless we do something now to start ending racism in America.”
© The Diamondback
There are plenty of outright racists who proudly own their bigotry and hate — you can find them in any corner of the internet. And then there are those who seem to think they should be able to express their messed-up views, be taken at their word when they half-apologize or try to explain them away, and suffer no criticism or repercussions.
11/5/2015- For the latter group, the internet has made living their dream increasingly hard. Georgia high school principal Nancy Gordeuk is the latest example of that. Video of the commencement ceremony at TNT Academy, a small private school in Stone Mountain, Georgia, shows her mistakenly dismissing the crowd before the valedictorian's speech, and then saying, "Look who's leaving — all the black people!" as she tried to correct her error while a racially mixed group of attendees continued their exit from the venue. When the footage went viral, and she was criticized for being both racist and wrong, she promptly blamed her comments on Satan: "The devil was in this house," she told local news station CBS 46, "and he came out from my mouth."
If anyone was convinced that Gordeuk had actually been momentarily overtaken by evil forces, her son swiftly ruined that theory. He defended his mom in a Facebook post, writing "y'all ni**ers aren't talking about shit so if u got something to say come see me face to face," and "my moma not racist one bit she's done nothing but help kids so y'all need to get stories straight." His easy use of a racial slur certainly did not do the job of convincing anyone that Gordeuk's not racist.
Between YouTube and social media, explicit expressions of racism are becoming increasingly harder to get away with. Think of the Oklahoma frat boys expelled when a cellphone video captured them chanting, "There will never be a ni**er at SAE ... you can hang him from a tree, but he'll never sign with me; there will never be a ni**er at SAE"; the congressional staffer who stepped down after news outlets published Facebook posts in which he likened his black neighbors to zoo animals; and the Ferguson, Missouri, municipal court officials who were fired after a search of their email revealed "jokes" based on dehumanizing stereotypes about African Americans.
There's no question that it can be satisfying to watch someone who's expressed bigotry be publicly humiliated in the same way that their words humiliated others - that's a huge part of why the recent commencement disaster has made national news. But here's what would be even more gratifying: if these people stopped feeling so confident that explicit racism was something they could get away with - or, even better, if they worked as frantically to rid themselves of their biases as they do at their futile efforts to excuse and explain them.
Hackers have put child abuse images on the memorial website of the Mauthausen concentration camp in Austria, 70 years after the Nazis' WW2 defeat.
8/5/2015- The website was quickly deactivated by the company managing it, a message on it reads. The Austrian interior ministry is investigating the attack. Interior Minister Johanna Mikl-Leitner called it a "criminal, sick attack and deeply abhorrent". The Nazis killed more than 100,000 people at Mauthausen in 1938-1945. The camp - one of the most notorious and biggest in the Third Reich - was liberated by US troops in May 1945. On Sunday there will be commemorations of the liberation at Mauthausen. The exact death toll is not known. Inmates, many of them Jews, were starved, tortured or gassed to death. Others died of exhaustion through hard labour. Many Soviet prisoners-of-war and Spanish Republicans were also among the victims. Many sub-camps were also set up near the main camp. Austrian officials suspect that far-right extremists may have carried out the attack, as such extremists have hacked several websites in recent months, Austria's Der Standard news website reports. The head of Austria's Mauthausen Committee - a group promoting human rights and democracy - condemned the hack. Willi Mernyi said it was "obnoxious and reveals the mindset of the hackers".
© BBC News
Cyber-bullying is a problem confronting countless young people.
8/5/2015- One parent knows this all too well. Her teenage daughter was the victim of cyber-bullies. What they did almost drove her 13-year-old daughter to commit suicide. The parent, a woman from Port St. Lucie whose first name is Jill, asked us not to use her last name. She told us her daughter was being bullied at school in person and on Instagram, where other students were posting photos of her with insulting, degrading comments. “I think it’s terrible. It’s basically a hate crime when you’re sitting there making fun of someone and belittling them,” Jill said. Her daughter goes to Oak Hammock K-8 School in Port St. Lucie. The Instagram account where the humiliating comments were posted was called “OakHammockHoes.”
When she realized what was going on, Jill went to the school, but she says they referred her to the police. About two weeks later, her daughter was so tired of the abusive comments on Instagram and so depressed that she tried to commit suicide by drinking bleach. She ended up in the emergency room. “What would you say to the parents of the kids who were doing this to your daughter?” we asked. “They need help. There’s something wrong with someone who feels like they have to make fun of someone else. It’s not right,” Jill explained.
She has this advice for other parents about how to protect their children from cyber-bullying. “Get involved with what’s going on at school. Every day that your child comes home from school ask them questions. Get involved with your child. Get involved. You don’t know how bad it is until it happens,” Jill said. Her daughter has been released from the hospital and still goes to the same school, because there are only a few weeks left in the current school year. She plans to enroll her in a different school in the fall. Cyber-bullying is a crime. So far no one has been arrested.
© CBS 12 News
8/5/2015- Pong, which first appeared in 1972, revolutionized entertainment in a way not seen since television was first introduced. The game consisted of two players each controlling a rectangular panel that would move up and down the screen trying to deflect a ball from entering their goal. It was considered the ultimate test of skill against your friend, or whoever else was next in line at the arcade. Over the course of a few decades, the video game industry exploded with popularity. It went from utilizing a gigantic box in a crowded public room to operating on a slightly less gigantic box in the comfort of your living room. The rectangles evolved into plumbers, plumbers to gorillas, and somewhere along the line we found ourselves stealing cars and robbing banks in 1080p.
It’s been a long ride, and it doesn’t seem to be slowing down, especially with the rise of Virtual Reality in the form of Facebook’s newly acquired Oculus Rift headset. However, as nice as new technology is, it seems that the more things change, the more they stay the same. Perhaps the clearest example of this is the lack of variety in terms of gender and race. It’s an issue that has been ever-present in the major video game blockbusters since video gaming began (save for an Italian plumber and his brother). Whether it’s The Last of Us, the Call of Duty franchise, the Far Cry series, or the Metal Gear Solid series, they all have one thing in common: a strong, white male lead role.
Now, that’s not to say there are no Hispanic or Black characters out there. Plenty of games include characters of various races, although not necessarily in the lead role. Most recently, we’ve seen Telltale Games’ The Walking Dead feature a lead role, Lee Everett, who wasn’t white, alongside a prominent female character, Clementine. Both characters have helped the series win a slew of awards, including a multitude of different “Game of The Year” awards. Following the overwhelming success of the first season of The Walking Dead, they produced a second season that starred an eight-year-old black girl, who proved singlehandedly that there need be no standard for a leading role in movies, video games, or books.
Race aside, the video game industry has long portrayed its leading women, and supporting female characters for that matter, in an excessively sexualized way, with anything from skimpy clothing to large breasts and butts being commonplace for female characters. Some major examples are Bayonetta, Street Fighter, Dead or Alive, Metal Gear Solid, and the well-known Tomb Raider series. Within the past few years alone, however, we have seen a change in the way race and sex come into play with various video game leads, in characters like Clementine from The Walking Dead, Sheva from Resident Evil 5, and Ellie from The Last of Us. It’s been a slow climb towards equality in this particular entertainment medium. However, if the last couple of blockbuster hits have indicated anything, it’s that you do not need to look a certain way to be the action hero.
© The Kaleidoscope, news site of Kishwaukee College
4/5/2015- In a new video, Jim Sterling highlights a game that is actually hate speech. Some people often claim a game is inappropriate or sends the wrong message, but a game called Kill The Faggot -- currently on Steam Greenlight -- has a pretty clear and concise message: LGBTQ+ individuals are different, weird and deserve to die. The description on Steam Greenlight doesn't mince any words on its intention to offend and appeal to the lowest common denominator within the community: "Hate gays ? Want to unleash your frustration with the "LGBT" community? Well now is your chance. Murder gays and transgenders, while avoiding killing straight people. Get as many points before time runs out!"
The game promises mediocre 3D graphics, three levels of play, fully voiced lowbrow innuendos, and an "amazing soundtrack."
Kill the Faggot is the creation of Skaldic Games, which, according to its website, is a game development company from the Los Angeles area. Skaldic says that its game is part of another game called " The Shelter: A Survival Story." It is likely that by the time you read this story or shortly thereafter this game will be pulled from Steam Greenlight. For the time being, it can be found here (we've also archived it here for posterity). There's no doubt that Kill the Faggot violates Steam's submission rules for Greenlight in that it uses incendiary language and imagery meant to incite. Whatever Valve decides, we will continue to follow this story as it develops.
© Game Politics
Students have come face to face with the potential perils of social media in a series of hard-hitting workshops designed to keep them safe.
4/5/2015- Cyber safety experts spent the day at Darlington School of Mathematics and Science (DSMS) highlighting the potentially negative physical, social and psychological consequences of using the internet. About 120 Year 8 students were introduced to Andrea Jennings and David Duckling, of Harbour Support Services, an organisation that works in Darlington providing outreach programmes for the victims and perpetrators of domestic violence. They also worked with Durham Police cohesion officer Chris O’Brien and town centre beat officer Alice Turner looking at the impact of hate crimes. Durham Police neighbourhood policing team officer Kathryn Davies and beat officer David Gibson delivered the third workshop addressing inappropriate use of social media including sexting and how easy it was to fall foul of the law.
Cyber safety is a particularly poignant issue in Darlington, following the death of teenager Ashleigh Hall, who was murdered in 2009 by a man she met online. DSMS assistant head teacher Emma Hickerson said: “As a resource the internet is as incredible as it is dangerous and it is vital our young people know how to use it appropriately. “They live in a cyber-world and the speed of technological development is breath-taking. We have to make sure they are fully equipped to maximise the incredible benefits of the internet but also stay safe from the many pitfalls.” Mr Duckling explained that domestic violence could be physical, emotional, financial and sexual. It affected men, women and children and Harbour was there to support victims and work with perpetrators.
Students heard that hate crimes often turned prejudice and discrimination into persecution, hatred and destruction when society should be celebrating diversity. PCSO Gibson stressed the importance of young people keeping their online profiles private. “Older people often befriend younger people on the internet to exert control over them,” he said. “Often in chat rooms people are not who they say they are and could be paedophiles so you need to be 100 per cent sure of who you are talking to. “The impact on young people’s lives can be huge and it also affects their families and friends.” He urged students to either click the CEOP button on their computer if they had concerns or approach an adult they could trust. He warned that inappropriate images, even when they were taken as a joke, were likely to break the law and anyone who sent them could find themselves charged with distributing indecent images. Students were also shown poignant videos covering a variety of cyber safety issues and hate crime scenarios.
© The Northern Echo
One day I awoke to a barrage of posts from strangers accusing me of racism for an article I didn’t write. Then I learned how to use social media to my advantage
By Josh Bornstein
5/5/2015- In the early hours of Friday 10 April, as I slept in Melbourne, American author Naomi Wolf was posting on Facebook to condemn me as “deranged”, “genocidal” and “psychotic”. Wolf and I have never met or communicated before. Regrettably, she was not alone. In the course of that night, I was on the receiving end of a battery of threatening emails from strangers, accusing me of base and hateful racism. My Twitter feed was filled with similar messages from all over the world. That Friday morning, I awoke to find myself caught in the middle of a social media storm. Hours before Wolf wrote on Facebook and in another part of the world, the Times of Israel, a publication with which I was also not familiar, published an article on its website containing a graphically violent and racist diatribe against the Palestinian people and calling for their “extermination”. The despicable article was attributed to me and was accompanied by my photograph. It was quickly disseminated in the hothouse that is Middle East politics and spread throughout the globe.
The barrage of threats that followed the article’s publication came predominantly from Europe, the US and the Middle East. One threat emanating from Little Rock, Arkansas, excoriated me as a “worthless piece of shite” and advised that I “would be dead soon”. Prior to receiving this missive, the sum of my knowledge about Little Rock was almost entirely derived from the autobiographical details of novelist Richard Ford and the Clintons. Those closer to home who know me are aware that I have never written a racist article in my life. On the contrary, I deplore racism and have been very vocal in support of strong laws against racial vilification and race hate. I have also criticised the Israeli government’s conduct towards the Palestinian people, most recently during the 2014 Gaza conflict. I had become a victim of identity theft.
That morning, having well and truly woken up and then worked out what had happened to me, I posted on Twitter in forceful terms to explain that I was not the author of the racist bile. This was the cue for highly agitated editorial staff at the Times of Israel to make some urgent attempts to speak with me. They had already torn down the article and, conscious of our different time zones, were waiting for me to wake up. By the time we spoke, they had already prepared an article to explain the “hoax” that had been perpetrated on their publication and on me. They sought my permission to name me and to include a short statement from me. The published article also apologised both to readers of the Times of Israel and to me.
It transpired that some weeks earlier, a person or group using my identity had made an online application to blog on the Times of Israel website. The application was checked and appeared authentic. Whatever process was followed, I did not receive any contact from the Times of Israel to verify my identity and blogging application. In the following weeks, articles that I had written and had published in the Guardian and other media organisations were posted on the Times of Israel site. The articles addressed wealth inequality in Australia, the success of business lobbyists in shaping public policy, the inhumane treatment of asylum seekers and other matters of political economy. My articles are easily accessible on the websites of media organisations that publish them and are also displayed on my personal website. Although I periodically write for various media outlets, I am not a blogger. A lawyer and occasional writer, yes. A blogger, no.
The editors at the Times of Israel thought it curious that an “Australian blogger” was posting articles on its site dealing with domestic political issues in Australia. On the other hand, a senior editor there told me that her father was a Jewish “labour lawyer” in New York and I was part of a rich Jewish tradition. In the weeks during which my real writing was published on an Israeli media website, I remained blissfully unaware. Then, having established an apparently respectable identity as a blogger on the Times of Israel website, the perpetrator struck. The article opened with observations about Talmudic law before descending into a litany of repulsive race hate. The article was so rancid that some queried whether it was a failed attempt at satire.
Social media shaming can escalate and spread all over the world at eye-watering speed. In the maelstrom that engulfed me for a time, I felt like I was standing in an amphitheatre surrounded by a hostile and highly multicultural audience who were baying for my blood. And the crowd kept growing – minute by minute. Where, as in this case of identity theft, the shaming is entirely misconceived, there is an upside: it can be curtailed quickly, too. A number of people suspicious about the authenticity of the racist rant did some fact checking for themselves. Even before I emphatically communicated on Twitter that I had never written or blogged for the Times of Israel, they had advised the digital mob that a hoax had been perpetrated. My denials followed. The Times of Israel then published its article explaining the hoax and apologising. Other interventions occurred online and over time, the mob muted.
The storm was all but over within 36 hours. Unlike other victims of social media shaming, I did not lose my job. On the contrary, my work colleagues rallied around me. That said, there is another digital twist to this bizarre and disturbing experience. Before the offending article was torn down, an image of it was placed on another site. Despite vigorous attempts to have it removed from the internet, it still continues to be peddled online. As a result, I have received more threats. A genuine blogger, Daniel Sieradski, was prompted by my experience to do some online detective work about this episode. He discovered that a few weeks before the fake blog began to be published by the Times of Israel, a post appeared on a website foreshadowing what was to come: “Using a fake Jewish name, profile, and photo, I got myself a blog on The Times of Israel,” the post read. “These people believe I’m really a Jew.”
Sieradski’s work led me to a site that appears to have been created by a neo-Nazi group based in the US. In one of their posts, the group denigrates me as a “subversive Jewish parasite”, a “human rights activist”, “open borders advocate” and “staunch supporter of hate speech laws”. The same photograph of me that was published by the Times of Israel appears on this website; this time with a yellow Star of David emblazoned on my forehead. As unpleasant as it is to be targeted in this way, more than anything, the experience has profoundly reinforced the kindness of strangers. A human rights lawyer based in Sydney who saw Naomi Wolf’s Facebook post about me intervened and prompted Wolf to retract her condemnation. It was replaced with “Progressive Australian Jewish Lawyer Josh Bornstein is a victim of a hoax that called for genocide.” I would have preferred a full apology but once again, Wolf was not alone in not offering one.
Many other strangers, including Palestinian and other Arab activists for Palestinian statehood, acted quickly to defend me from further attacks. They told me of their concern for my welfare and their determination to disseminate the truth. One such activist for Palestinian rights sent me the following: “Josh, saw your situation and ensured to share the facts here in the UK among the various groups sharing that awful Times of Israel blog in your name.” Could I have done anything differently to avoid this attack? I suspect that like many others, significant aspects of my identity, including photographs, are there to be found on the internet. I am outspoken and I am Jewish. Am I going to change any of that? Not so long as my tuchus points south. While my public identity, writing and activism undoubtedly elevated the risk of identity theft, social media participation is a two-edged sword. As journalist Sarah Seltzer observed:
“But imagine if Bornstein hadn’t been active on Twitter, or easily findable online – and then imagine if the screed posted in his name had been just a tad more subtle and less obviously fishy. In such a case, the post might have stayed up for much longer and made a more lasting digital imprint under his name.” White supremacists don’t do nuance. For that too, I can be thankful.
A reply from Naomi Wolf
What happened to Josh Bornstein, who wrote a piece today about having had his identity stolen and used as the byline for a hate-filled diatribe, was awful. As soon as Bornstein’s piece was posted on social media, in fact on the same day, I asked on Facebook if the piece was a hoax, and asked for citizen journalism confirmation of the piece. I wrote, as soon as the hoax was confirmed, that the piece was not authentic, and I wrote about how awful it was that someone stole Bornstein’s identity. I noted that Bornstein spoke up against human rights abuses.
Bornstein never contacted me, but I would have been, and am now, very happy to offer an apology to him for my initial horrified response to a very racist piece. I would add this regret to the regrets I expressed at the time that a fellow writer’s voice was hijacked. I think that since I asked right away if this blog post was a hoax, and questioned the veracity of the op-ed from the outset, I did what was journalistically ethical and appropriate. But what happened to Bornstein was terrible, and I am truly sorry that my distress at the racist language in the blog post added to his understandable distress at the theft of his identity.
© The Guardian
4/5/2015- The number of anti-Semitic attacks in the form of verbal attacks, harassment and threats rose in the Czech Republic in 2014, mainly on the Internet, while there were very few physical attacks, the Prague Jewish Community says in its annual report, adding that most Czechs do not share anti-Semitic views. The Czech Republic ranks among the countries where anti-Semitism is not significantly present either in the majority society or among politicians, says the report the Jewish community released to CTK Monday. The number of physical attacks on Jewish targets, persons or property remains almost unchanged compared with the previous years, the report showed.
# In 2014, the Jewish community registered one physical attack on a person, the same number as in 2013.
# Five attacks on property were registered, which is two more than in 2013. Most of them were the defacing of graves at Jewish cemeteries.
# "The number of registered incidents such as verbal attacks, hate e-mails and threats addressed to people of Jewish origin has risen sharply, almost four times," the report says.
# The number of anti-Semitic attacks on the Internet rose by 20 percent against 2013. This trend has continued for four years now, according to the report.
# "In 2014, there was also an increase in the number of so called new anti-Semitism aimed against the State of Israel," the Jewish Community writes.
That is why manifestations of anti-Semitism were more intensive during the escalation of the conflict in the Middle East. Some groups view Czech Jews as envoys of Israel and blame them for Israel's political decisions, the report says. In 2014, the Jewish Community registered 28 cases of harassment and nine cases of threats, compared with six and three in the preceding year, respectively. The total number of these incidents was four times higher in 2014.
Like in 2013, anti-Semitism in 2014 was not a matter of right-wing extremists only, but it was also spread by leftist groups and individuals without any obvious links to extremist groups. "This may be because anti-Semitism targeting the State of Israel is a form of anti-Semitism that is more accepted in society," said Petra Koutska Schwarzova, from the Prague Jewish Community. Unlike abroad, anti-Semitism in the Czech Republic has not taken violent forms for the time being, the report says. Although the number of physical attacks is low in the country, a terrorist attack by radical individuals is the biggest threat to the local Jewish community, it says. "Examples of such attacks abroad show that security measures in the vicinity of sensitive places, such as Jewish buildings, must not be underestimated," Koutska Schwarzova said.
© The Prague Daily Monitor
Darren Fletcher, 25, of Wednesfield, was previously jailed for KKK YouTube video
1/5/2015- A lout who dressed up in a Ku Klux Klan outfit and pretended to hang a life-size golly doll has been jailed over racist rants on Facebook. Darren Fletcher was locked up for eight months for breaking an order banning him from making race-hate remarks online. He previously served a year in prison for stirring up racial hatred by posting the KKK video on YouTube. But the 25-year-old flouted the terms of his criminal anti-social behaviour order (CRASBO) on his release by launching more racist tirades. He even posted that a newspaper needed “bombing” for its coverage of his original court case. Fletcher was sent back to prison for eight months at Wolverhampton Crown Court today. The forklift truck driver, who has Asperger’s syndrome, had earlier admitted breaching the terms of the CRASBO at the city’s magistrates court.
The sentence was welcomed by Det Chief Supt Sue Southern, head of the West Midlands Counter Terrorism Unit. She said: “Fletcher blatantly flouted the conditions the court imposed on him by posting racist and anti-Semitic comments. “We understand how offensive and distressing this type of behaviour can be and worked to bring him before the courts for a second time. “West Midlands Police takes all forms of extremism seriously and we urge anyone with any concerns to contact us on 101.” Fletcher, of Kitchen Lane, Wednesfield, is also known as Christopher Phillips and Darren Clifft. He set up a Facebook page using the name The Whitest Knight and used it to express sympathy with a fellow far-right supporter who was posting anti-Semitic tweets about a Jewish MP.
Comments on the page included accusing the British Government of doing more to help Jewish and black people than “its native whites”. Fletcher also declared that he “hated Britain with a passion” and made threats against Jews and black people. Nicholas Towers, defending, urged Judge John Warner to pass a non-custodial sentence. He said the postings breaching the order had been intended for an audience sympathetic to convicted extremist Garron Helm. “This wasn’t someone on the streets shouting abuse,” Mr Towers added. “It was supposed to be for a relatively limited audience.” But the judge said Fletcher had “deliberately, defiantly and flagrantly disobeyed the order”. He added that anything less than a custodial sentence would have been a “green light to carry on”.
© The Birmingham Mail
29/4/2015- Brno's Masaryk University (MU) has launched a training centre where experts can simulate serious cyber attacks and practise defending against them, MU representatives told media Wednesday. The cyber polygon cost 22 million crowns. Its closed laboratory allows defence training without endangering infrastructure outside. It can be used by scientists, students, company specialists and employees of the National Security Office and other state security bodies. The laboratory has unique software that can simulate any network and situation, including, for example, an attack on a nuclear power plant or the electricity grid's operational system. "Very important is the [centre's] safe separation from real networks. If we want to train defence, we also have to create offensive means, which would cause a big problem if they penetrated the real network," said Vaclav Racansky, head of the security section at the MU's Institute of Computer Science. In the past, MU experts assisted in preparing the Czech law on cyber security, which obliges infrastructure operators to ensure its safety. Hackers cause hundreds of billions of dollars worth of damage a year. The Czech Republic, too, has experienced extensive cyber attacks. In 2013, for example, a four-day attack targeted news servers, mobile operators and banks.
© The Prague Daily Monitor
By Jake Bennett, James Rund and George Dean.
26/4/2015- There have been a series of hate speech incidents over recent weeks in Tempe and Mesa, orchestrated by Neo-Nazi groups and hate preachers. Some of the incidents included anti-Black, anti-immigrant, anti-LGBT and anti-Muslim speech and intimidation. This behavior and these sentiments do not reflect the values of our community. We the undersigned have joined together to express our opposition to the presence and activities of hate groups in our community. We are deeply concerned about recent manifestations of hate. As community leaders, we have united to underscore our common value of working together to create a community of respect.
Our shared fundamental principles require us to speak out when we see hate around us. Ignoring the presence of hate speech, with its accompanying literature and social media, does not make its vile message disappear. We will not sit idly by when hate raises its ugly head in our community. Not only are we united in denouncing hate, but we are united in supporting a community that is committed to the free exchange of ideas, the principles of inclusion and the celebration of diversity.
— Jake Bennett, ADL Arizona
— James Rund, ASU administration
— George Dean, Urban League
— And 22 other co-signers
© Arizona Central
A new survey suggests that Canadians who see hateful content on the internet tend to stay mum about the material.
24/4/2015- Two-thirds of the respondents to the Leger Marketing poll said they ignore hateful or racist online postings. Just 11% said they reply and react to the material and an equal number said they "tell the responsible authorities to remove it." Among those who ignored online hate speech, 15% said that responding is a "waste of time," would be "pointless" or they admitted that "I don't care." Another 17% said that reporting inflammatory internet comments would give undue attention to the remarks or would worsen the situation. Canadians gave other reasons including "can't prevent others from propagating hate / racism" (7%) and "don't want to be involved" (6%). The Association for Canadian Studies and the Canadian Race Relations Foundation commissioned the March 16-18 survey of 1,711 Canadians. The poll was provided exclusively to Postmedia.
A recent Leger poll indicated Canadians had a paradoxical attitude towards racism. The survey, taken in September, indicated the vast majority of Canadians don't believe they're racist but up to a third admit making racist remarks and to supporting racial stereotypes. Jack Jedwab, whose Association for Canadian Studies commissioned both polls, told Postmedia that people are becoming desensitized by the sheer volume of internet hate. "The strength of that message confronted with what ... people are seeing on social media, it creates a sort of indifference," said Jedwab. "We suffer as a society when there's an increasing feeling of indifference in the fight against racism." The anonymity of social media, and the lack of human moderators, can be a blessing and a curse, the researcher added. "I'm not knocking social media but we can see that this is a particular area where there's an abuse that we haven't found a way to address," he said. "A lot of the comments you're seeing on social media would never make it into print."
© The Toronto Sun
17/4/2015- The Latin American chapter of the World Jewish Congress launched a Spanish-language website to combat Holocaust denial. The website went online on April 16, Israel’s national day of commemoration for Holocaust victims. The site, www.seismillonesnuncamas.com, which means “six million never again,” targets Spanish-language websites, where Holocaust denial is increasing, according to the WJC site’s initiators. “We launched a website to show how the hate is spreading on the web,” Ariel Seidler, director of the Web Observatory, a watchdog group set up by several Latin American Jewish groups, told JTA. “We encourage people to report videos that spread hatred and encourage the addition of positive content.”
According to the Web Observatory, some 350 Spanish-language videos denying the Holocaust have received nearly 10 million views collectively. Some of the films are tagged with the word “holocuento,” a term used to lampoon the Holocaust or suggest it did not happen, similarly to “holohoax” in English or “shoananas” in French. In 2011, a civil court in Buenos Aires ordered Google to eliminate anti-Semitic search suggestions from its Argentine browsers and drop some 76 websites described in the complaint as “highly discriminatory,” including some that deny the Holocaust. Nazism and Holocaust denial are still alive, “and we can see this on the Internet,” said Claudio Epelman, executive director of the Latin American Jewish Congress, the regional branch of the World Jewish Congress.
© JTA News.
By Krystle Mitchell
10/4/2015- YouTube is a well-known platform among many people worldwide. Many people use the site for a variety of reasons, whether to build an audience, broadcast their talents, watch shows, learn something new, or listen to music. Lately, some users have grown concerned that the company's creators are prejudiced. Hateful remarks constantly appear on the site, and the platform has not found a way to fully protect its users from verbal abuse. YouTube's creators have been questioned about racism over their failure to promote non-white individuals and over their guidelines for banning hate remarks.
YouTube is not responsible for promoting everyone. When a user joins the platform, he or she must already have something of a following. The site is designed to promote channels and users that are worthy of a worldwide audience. Moderators also help those who deserve to get noticed by placing them on the home screen or sharing them on Twitter. However, only a few darker-skinned users appear among those shares. It was not until February 2015 that Akilah Hughes, a Fusion contributor and YouTube user, took the initiative to question YouTube's creators about their failure to promote brown-skinned creators. Hughes tracked the number of shares YouTube made on its Twitter account and homepage, and recorded how many of the people featured each month were non-white. Her results suggested the platform's creators might have a racial bias against those who are not white.
Once Hughes’ information was completely gathered, she presented her findings to a spokesperson and asked the creators about their racist behavior. The spokesperson said YouTube is available to anyone around the world to upload videos, gain a following and profit for their content. Since it is very open, it has accumulated a large diverse library reflecting a broad spectrum of cultures, beliefs, classes, sexualities, and races that are underrepresented elsewhere.
While Hughes is the first to do such a study and question the creators about their lack of promotion, she is not the first user to question them about racism. CNN interviewed a famous user who was harassed about her race by viewers. When CNN asked YouTube's creators about the harassment and why they have not created better protections, they said the guidelines are clear that hate comments are not allowed and should be reported. Users have the ability to block, delete, and refuse comments altogether. This suggests the site will not go further than it already does to protect users from obscene comments unless users take those steps themselves.
A researcher who focuses on social issues online stated that YouTube is a reflection of the culture people live in. The hate comments are visible proof that many people out there are very racist. Social media platforms are outlets for those people to convey their hate, because the only punishment they face is being blocked by the person they dislike. The person is still allowed to see other videos of the YouTuber in question after being blocked from commenting. Therefore, many users ignore the prejudiced comments and continue their craft, since there is no way to stop them altogether. Questioning YouTube's creators about racism and their handling of hate speech is only a stepping stone for what is to come. The guidelines need updating to prevent further threats, along with better ways to promote creators who are not being recognized.
© The Guardian Liberty Voice
Italy's far-right leader Matteo Salvini has been temporarily banned from Facebook for using the word “gypsies”, he claimed on Thursday.
9/4/2015- Salvini, leader of the Northern League, said his personal Facebook profile had been blocked for 24 hours after he wrote “gypsies” (“zingari”). Turning to Twitter, the politician said the move was “absurd!” A spokesperson for Facebook was not immediately available to confirm whether Salvini had been temporarily banned. Salvini came under criticism yesterday for stating that if given the chance he would “raze the Roma camps to the ground.” Speaking on International Roma Day, Salvini said around 40,000 ethnic Roma currently living in government-run camps should rent or buy homes. Members of the community, however, face barriers in applying for social housing, even though many people living in camps were born in Italy. Associazione 21 Luglio, a Roma rights group, has called for the camps to be closed and residents to be given equal access to housing. The association on Wednesday accused Salvini of courting voters and said the Northern League had in the past proposed maintaining the camp system. In a bid to tackle discrimination against the Roma community, Rome’s mayor last year banned the word “nomads” (“nomadi”) being used in city hall. Mayor Ignazio Marino said Roma, Sinti and Caminanti (travellers) were more accurate terms which could help promote integration.
© The Local - Italy
2/4/2015- Uber. WhatsApp. Twitter. Google. Snapchat. Instagram. Facebook. Many of the online services most popular among Europeans were created in the United States. The EU wants that to be different in the future. The next generation of software, needed to operate the so-called internet of things (the connectivity of physical objects) and to handle big data, should come from Europe, EU digital economy commissioner Guenther Oettinger said at a recent event on the future of the internet. “Europe’s industrial competitiveness will in the future depend to a large extent on the capacity to develop high quality software and using the most modern computing technologies”, Oettinger said in a speech at the Net Futures event in Brussels on 25 March. To do that, the EU has had a set of software tools created to make it easier for entrepreneurs to transform their idea into a working application. The project is called Fiware – sometimes spelled Fi-ware – a contraction of the words 'future internet' and software. However, critics say the project, which is costing EU taxpayers €300 million, is superfluous because alternatives already exist.
Dutch entrepreneur Michel Visser is the founder of Konnektid, a website which allows its members to find neighbours willing to teach them something - like how to play guitar, speak another language, or how to knit. His company is now building an app version for mobile phones. The programme will need a system that can handle a large amount of requests. Visser has adopted a readily available system from the Fiware toolbox. “We don't have to develop it ourselves, so we win three months of development. Now we can get the app earlier out to the market”, he told this website. "We are a start-up, so we don't have a lot of money to spend." Fiware is kind of like a big box of Lego blocks, said Christian Ludtke, founder of a German company that supports start-ups. “It can be a web service for example, or a cloud service, or an interface for augmented reality”, said Ludtke.
The Fiware project is a public-private partnership between the EU and a consortium of companies that started in 2011. The software tools that entrepreneurs like Visser may use were developed by European telecommunication companies like Telefonica and Ericsson. The industry has said it is also investing €300 million in the project, which includes online tutorials on how to use Fiware, and local 'Fiware innovation hubs'. Fiware is royalty-free and open source, which means that it can be used free of charge, and developers may further develop it as well. Non-European companies can use the tools as well. “We don't mind if they are from Japan, from US, from China, from Latin America”, said Jesus Villasante, from the department of Net innovation in the European Commission.
“What we don't want is that there would be only one operator that would be able to capture value. For us the idea is that internet should be open, and therefore we should allow for open initiatives that would compete with some proprietary initiatives.” Proprietary software, as opposed to open source, can only be used if you have acquired a license. Examples include Microsoft Windows, Adobe Photoshop, and Mac OS X. "In Europe there is a strong potential for innovation, for start-ups, for entrepreneurs. We need to have this innovation capacity in an open environment, not in a closed environment”, noted Villasante.
Grants for start-ups, but only if they use Fiware
To promote the use of Fiware, the EU is investing €80 million in up to 1,000 start-ups. The money is being distributed to 16 so-called accelerators, organisations that help start-ups grow by providing funding and other support. Konnektid is one of the beneficiaries of such an accelerator, called European Pioneers, based in Berlin. One way the EU is trying to spread the use of Fiware is by making grant money - up to €150,000 per start-up - conditional on its use. “It's a kind of a trade-off. You need to find Fiware attractive and useful. If not, then you probably should be applying to a different accelerator”, said Ludtke, adding that the 12 start-ups under his guidance have so far not experienced it as a burden. Michel Visser hasn't either, although he is defiant about what would happen if he found a piece of non-Fiware software that would be better for his app. “It's business first. If it's stopping my business I would definitely say: listen, I tried it, this is what I experienced, this is my feedback, but I'm going to use something different.
That's what I would fight for. I'm a founder of a company and I need to run my business.” The EU commission's Villasante is much less strict than Ludtke (who oversees the handing out of money to some start-ups) on the use of Fiware as a precondition. Villasante said it was more important that the start-ups tried Fiware to see if it is useful to them. “We don't believe that all the 1,000 start-ups will develop applications that will be successful in the market. There may also be some SMEs that play with Fiware, develop the product, but decide: this is not for me, I prefer to use this other thing. That's fine.” Some recipients of the EU grants have told this website that they were more interested in the grant money than in Fiware. “There are plenty of alternatives to Fiware that are also open source,” said one entrepreneur who wished to remain anonymous. “The EU is pushing software that is not necessarily the best,” he added.
© The EUobserver
Facebook is tracking users, both on and offline, contravening EU privacy rules, according to a report.
1/4/2015- Compiled by researchers for the Belgian Privacy Commission, the report says the social media giant places cookies whenever someone visits a webpage belonging to the facebook.com domain, even if the visitor is not a Facebook user. A cookie is a small file placed onto a computer by a browser and contains information that can be used to track and identify users. People without Facebook accounts are not spared. The 67-page report, first published in late February and then again with updated chapters on Tuesday (31 March), notes that “Facebook tracking is not limited to Facebook users.” Facebook places a so-called “datr” cookie, which contains a unique identifier, onto the browsers of people in Europe who have no Facebook account. The cookie takes two years to expire.
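The mechanism the report describes can be sketched in a few lines of code. This is a purely illustrative, hypothetical server-side example using Python's standard library, not Facebook's actual implementation; only the cookie name "datr" and the roughly two-year lifetime come from the report, while the identifier format and domain handling are assumptions.

```python
from http.cookies import SimpleCookie
import secrets

def build_tracking_cookie() -> str:
    """Build a Set-Cookie header for a long-lived identifier cookie of
    the kind described in the report (hypothetical sketch)."""
    cookie = SimpleCookie()
    # A random, unique identifier lets the server recognise the same
    # browser on every later visit, account or no account.
    cookie["datr"] = secrets.token_urlsafe(16)
    cookie["datr"]["max-age"] = 2 * 365 * 24 * 3600  # roughly two years
    cookie["datr"]["domain"] = ".facebook.com"       # sent to all subdomains
    cookie["datr"]["path"] = "/"
    return cookie.output(header="Set-Cookie:")

header = build_tracking_cookie()
print(header)  # e.g. Set-Cookie: datr=...; Domain=.facebook.com; Max-Age=63072000; Path=/
```

Because the `Domain` attribute covers the whole domain and the lifetime is measured in years, any page that loads content from that domain (a like button, an embedded video) causes the browser to send the identifier back, which is what allows visitors without accounts to be recognised across sites.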
Facebook says it takes this commitment one step further. "When you use the EDAA opt out, we opt you out on all devices you use and you won’t see ads based on the websites and apps you use," said the spokesperson.
© The EUobserver
3/4/2015- A Facebook troll who targeted a disability rights campaigner has been reported to Police Scotland for alleged hate crimes. Tony Bain taunted Rachael Monk with a string of vile comments including saying: “Two Mongs don’t make a right.” But the abusive messages backfired after 32-year-old Rachael, from Dumfries and Galloway, struck back. She told Bain, from Glasgow: “I’m the lady in the video…your comments have actually made me laugh, they’re nothing new or original. “Did you know hate crime is a criminal offence? Good luck when the police come knocking! “Ignorance is a bigger disability.” Rachael, from Annan, has cerebral palsy and can only speak through a computer. She appeared in a video made by the “Now Hear Me” campaign, which promotes understanding for those suffering from speech impairment or loss as a result of disability.
In a series of Facebook messages, posted on an NHS page, Bain also commented: “Warning you will need an umbrella before watching this video.” The comments sparked outrage from campaign supporters, who called Bain “worse than disgusting” and a “trolling little scumbag”. Another asked Bain: “Have you actually looked in the mirror?” Bain, who revealed he was previously employed by Argos, was also asked: “Which Argos do you work at Tony? I’d love to pay you a visit.” Someone else wrote: “Send me your address privately then Tony Bain and we’ll see who likes hiding behind a keyboard. I’ll knock all them filthy yellow teeth out with a single bat.” Bain was also told: “You’re worse than disgusting, I’m not sure there is a word for people like you yet but there should be and it should come with a jail sentence. You are a disgrace to your family.”
A spokesman for NHS Education for Scotland said, “As soon as we became aware of wholly inappropriate posts on our Facebook page, we took action to block the individual. We have also reported the matter to the police.” A spokesman for the Now Hear Me campaign also confirmed that they had reported the incident to Police Scotland. Police Scotland was unable yesterday (Fri) to find a record of the NHS complaint. But the force added they “would assess and investigate as appropriate any complaint made to us concerning comments which could be considered criminal”. In the campaign video, Rachael says, “I am a bright person and intelligent, and I want my thoughts and feelings to be heard and not wasted. Everyone has the right to communicate.”
Rachael, speaking through her carer today, said: “Although I was hurt by the comments made I wanted to respond in a positive way.” “My family and friends were extremely offended and deeply upset that someone could be so cruel about me and others with different abilities.” “It is because of people like him that makes it all the more important to raise awareness and educate.” She added: “I think Tony Bain should receive a warning, if only to make him think about his actions.” “I feel that the police should definitely be involved in such matters. It is important that everyone knows such behaviour will not be tolerated.” Argos confirmed that “Bain left the business last year”. Bain could not be contacted for comment.
© Deadline News
A Newport man has been fined after posting racist comments on Facebook.
30/3/2015- Jason Gwyer, aged 32, of Brown Close, was convicted of a racially aggravated public order offence after posting racist comments on Facebook in relation to the annual Ashura march which takes place in Newport. The march organised by the Islamic Society for Wales was to commemorate the anniversary of the martyrdom of Imam Hussain who was killed in Karbala, Iraq, more than 1,300 years ago. The details of the march were published in the Argus in November 2014, and Gwyer posted a photo of the article along with racist comments on his Facebook page on November 12, 2014. Gwyer posted: "Need this to go viral!!!! Muslims think they are going to have a nice little march thru my city on Sunday!!! think not!!! Need as much force as possable. We need to stand up and tell these vile pigs where to go!!! Who is with me??? Please share." He was found guilty at Newport Magistrates Court and fined £165. He also had to pay costs of £620.
He was also charged with producing cannabis, a Class B drug, and possession of cannabis, and pleaded guilty to both offences. He received a 12-month community order and a £100 fine, and the drugs were ordered to be destroyed. PC Ricky Thomas, the investigating officer, said after the hearing: "Gwent Police will not tolerate any type of hate crime in our communities. We will investigate it and put evidence before the courts for the offender to be dealt with. "I hope this serves as a warning to people who think that by posting on social media sites that it is anonymous in some way - it isn't and it's still an offence. We would encourage anyone who has concerns about anything they see on social media to report it to us on 101."
© The South Wales Argus
While the amount of racially motivated crime is in decline, extremists are adopting increasingly sophisticated ways of spreading their message online, experts say, and the government is taking this into account as it adopts a new policy to combat extremism.
31/3/2015- It is hard to prove that extremist groups violate the law in such cases, even though they have political ambitions, according to the Concept for the Fight against Extremism, which the government adopted on March 18. “The development of criminality shows a trend whereby displays of racial discrimination and other forms of hate crime have recently been shifting from the street to the virtual sphere,” the concept reads. The number of reported crimes related to extremism is decreasing: police recorded 40 extremist crimes in 2014, down from 64 in 2013 and 49 in 2012. Exactly a dozen extremist groups operate in Slovakia, including political parties such as People’s Party - Our Slovakia (ĽSNS) and sport and paramilitary groups such as Slovak Levies or Action Group Vzdor, according to the government document.
NGOs dealing with extremism approached by The Slovak Spectator say that formal punishment and prosecution alone are not enough. More than anything else, the public and important political figures should condemn such behavior. “I don’t think the eye of Big Brother should be the main tool against the spread of hatred on the internet,” Laco Oravec from the Milan Šimečka Foundation (NMŠ) told The Slovak Spectator.
The internet is a strategic place for extremist groups because they do not have sufficient space in mainstream media. Moreover, young people, who are more active on the internet, tend to spread these ideas, according to Tomáš Nociar, a political scientist focusing on extremism. “Of course, using the internet in this regard is nothing new; however, this phenomenon has become more relevant in recent years,” Nociar told The Slovak Spectator, “because the number of people using the internet every day is increasing.” The strategy proposes improving cooperation with internet providers in order to better track extremist statements and material spread via the internet.
The Bratislava Without Nazis initiative is currently researching statements published on Slovak websites and social networks, and has found dozens that could violate the law, according to the group. The police, however, do not prosecute the authors of this extremist material, which raises questions about how they work, according to Róbert Mihály, a member of the initiative. “Just open Facebook, there are hundreds of such cases there,” Mihály told The Slovak Spectator. Mihály was one of seven people detained by police during a March 14 march when he and others confronted a group commemorating the wartime Slovak state, an ally of Nazi Germany.
While some extremist rhetoric may violate local laws, it can also be difficult to prosecute. The power of the police is further limited because a large amount of clearly extremist content that could be grounds for prosecution sits on US-based servers, where Slovak legislation does not apply, leaving Slovak authorities struggling to deal with it, according to Nociar. The police refuse to describe their methods of fighting extremism, citing tactical reasons, police spokesman Michal Slivka told The Slovak Spectator.
Humor is better than jail sentences
Instead of repression, Nociar proposes that the public fight extremism with humor or rational arguments that make it less attractive. Slovak society, however, often fails to recognize extremism as a problem, and this also affects how police deal with such cases, according to Oravec. “We lack clarity about the line where a crime, or at least a taboo, is crossed,” Oravec said. The voice of political figures should also be stronger in the fight against extremism. Journalists, analysts and NGOs participate in the public debate on this issue, but statements from politicians are missing or evasive, according to Grigorij Mesežnikov, president of the Institute for Public Affairs (IVO) think tank. “Have you recently noticed a government representative clearly describing his or her attitude towards extremism, and not only in the form of general statements?” Mesežnikov asked The Slovak Spectator. “To organise an ad hoc press conference after some unpleasant event and then consider one’s part in the fight against extremism complete is insufficient.”
As part of prevention, the government should bolster efforts to educate people about extremism and to highlight the threat it represents via the mass media and schools, according to the concept. The state underestimated the power of education right after Slovakia joined the EU in 2004 and has not sufficiently explained to the public how much organizations such as the EU or NATO have done for Slovak well-being. This is why some Slovaks are keen to believe hoaxes, according to Mesežnikov. “We should focus on high-quality education of young people so they will be able to critically assess and sort information and respond to hatred on the internet,” Oravec said.
© The Slovak Spectator
30/3/2015- Less than a week after EU digital commissioner Andrus Ansip announced he wants to end geo-blocking, his fellow commissioner Gunther Oettinger indicated he was in no rush to abolish the practice of restricting online content based on someone's location. “We should not throw away the baby with the bath water”, Oettinger said in an interview with German newspaper Frankfurter Allgemeine Zeitung, published Monday (30 March). “I want to examine what an opening would mean for the film industry”, the German commissioner noted, adding that “we should protect our cultural diversity”. The interview comes after Ansip said in a press conference Wednesday (25 March) that he wants to end geo-blocking. “I hate geo-blocking”, noted Ansip, one of the commission's vice-presidents, who is in charge of the digital single market portfolio.
Oettinger, whose portfolio is called digital economy & society, made light of Ansip's remark, by saying: “I hate my alarm clock at five o'clock in the morning.” “I wouldn't read any contradictions in this”, commission spokesperson Mina Andreeva told this website Monday. She said that “Ansip and Oettinger worked very closely” to prepare a debate with all commissioners last Wednesday, and that the entire commission that day “agreed that geo-blocking would be tackled”. However, Andreeva noted that tackling geo-blocking was only agreed “on a general level”, and that the details need to be worked out now. The details are at the crux of the matter. Geo-blocking is a technical tool that can be used for both 'good and evil'. Sometimes companies use geo-blocking to abide by the law, for example when a gambling website uses it to make sure its services are unavailable in countries where online gambling is illegal. And Ansip has also acknowledged that such practices are acceptable.
However, geo-blocking is also used to redirect online shoppers to a local website which offers the same products at higher prices, which can be illegal under EU law. Another type of geo-blocking occurs when media companies prevent consumers from watching online content like films or TV series in a territory where the company has not acquired licences. Here, the debate gets murkier. The commission has agreed to eliminate “unjustified geo-blocking”. But defining when geo-blocking is justified, and when it is not, will only begin now. The commission is due to publish its digital single market strategy on 6 May. Then it will hold a public consultation on geo-blocking. Some, like Pirate MEP Julia Reda, oppose “all kinds of artificial barriers on the web and all kinds of website blocking”. The German deputy argues for the introduction of the so-called 'country of origin principle' for online videos. “That would mean that companies have to obtain a copyright licence only in the country from which they operate, which has long been the case for TV broadcasting,” Reda told this website in an e-mailed statement.
It is not the first time the two commissioners differ in tone on the same topic. “We need strong net neutrality rules”, Ansip said Tuesday (24 March), referring to the principle that all data is treated equally by internet providers and intermediaries. “We need an open internet for consumers. No blocking or throttling”. A few weeks earlier, Oettinger called net neutrality a “Taliban-like issue”. “This is not the first time Oettinger has contradicted the Commission and undermined its stated consensus in his home country's media”, noted Reda, and referred to it as the "latest of his PR missteps".
© The EUobserver
1/4/2015- A Facebook page that attacked aboriginal people in Winnipeg and re-ignited the racism debate in the city has been pulled down. The page, called "Aboriginals Need to get a job and stop using our tax dollars," claimed support for Kelvin High School teacher Brad Badiuk who was suspended in January after making racist comments on his own Facebook page. The page was created in December — the same month Badiuk's posting was made. Before disappearing on Wednesday, the page had close to 5,000 members and was filled with negative comments about aboriginal people.
Robert Sinclair, an aboriginal man, who came across the page on Tuesday, called it a hate crime and hopes the people behind it are held accountable. "Knowing the fact that people [were] looking at and supporting it, it doesn't say a great deal of positive outlook for the way that Winnipeg is directing themselves," he said. Just before it was pulled down, the page started getting a lot of posts critical of it, with at least one person calling the administrators "racist a—holes." A new Facebook page called Protest against "Aboriginals Need to get a job and stop using our tax dollars" started in response and was applauding the removal of the racist page.
'Inspiring, important moment'
One aboriginal leader says he's not angered by the page, but rather inspired by the opportunities it presents. Niigaan Sinclair, who teaches indigenous literature, culture, history and politics at the University of Manitoba, said it used to be that no one talked about racism, that it was swept under the rug. Now, people talk about racism and relationships every day, and that is the only way to make things better. "I actually think this is a really inspiring important moment," he told CBC News on Wednesday, adding he wants people to talk about what it means to be a meaningful citizen in this city.
Police asked to investigate
Some are calling for police to hold those responsible for the Facebook page accountable. Tasha Spillett, an aboriginal activist and educator in Winnipeg, said the page refers to death camps for aboriginal people. She says it's hate speech and must be investigated. "Completely horrendous. Like to say something like that is just atrocious. But that's the beast, that's racism. Racism is hurtful. It's dangerous," she said. "[It's] another assault on us. Yesterday Facebook was not a safe place." But it also offers a chance to dig into the roots of racism.
Spillett says she shared screen grabs of the page with her friends on social media, and it hit a nerve, similar to what happened earlier this year when Macleans magazine called Winnipeg the most racist city in Canada. "You could really see the community response in Winnipeg, saying, 'Oh my goodness. This is not acceptable in our community,'" she said. "For Winnipeg to stand up and say, 'Hey, Facebook may have these community standards but these are not our community standards.'"
© CBC News
30/3/2015- OSCE Representative on Freedom of the Media Dunja Mijatović said today that the unilateral decisions by the Interior Ministry in France, taken without judicial oversight, to block five websites for allegedly causing or promoting terrorism represent a serious threat to free expression and free media. “Blocking websites without judicial oversight may endanger free expression and free media and creates a clear risk of censorship of online content by political bodies,” Mijatović said. The Representative urged the French authorities to reconsider the parts of the anti-terrorist law enabling website blocking, which was passed in November last year. “Legislation to fight terrorism should not curb free speech by introducing notions that are too vague or lead to the repression of free expression,” Mijatović said.
The Representative also noted with concern legislative debates in several OSCE participating States over provisions with a similar potential impact on the freedom of expression. These include new criminal provisions approved in Spain regarding access to or dissemination of extremist content, and certain anti-terrorist provisions in proposed Bill C-51 in Canada. Mijatović said her Office is monitoring developments regarding anti-terrorist proposals and their effect on free expression throughout the OSCE region. “I call on all OSCE participating States to exercise care and restraint when introducing anti-terrorist laws that could endanger freedom of expression and free media,” Mijatović said.
© OSCE Office of the Representative on Freedom of the Media
28/3/2015- Is nudity ever allowed on Facebook? It's a question that has got plenty of photo uploaders in trouble. In response, Monika Bickert, head of global policy management, and Chris Sonderby, deputy general counsel, wrote on the site's official blog, attempting to provide examples and "more detail on what is and is not allowed" and unveiled updated community standards. The guidelines cover everything from bullying and threatening behaviour to trading illegal goods, but the latest changes mainly concern nudity, hate speech and terrorist activity. Nudity has been a contentious area on social media recently.
The ongoing Free the Nipple campaign was launched in response to the fact that only male nipples are allowed on Instagram and in the new rules Facebook admits "our policies can sometimes be more blunt than we would like". As such, genitals and "fully exposed buttocks" aren't allowed, and campaigners won't be pleased to find that Facebook will still "restrict some images of female breasts if they include the nipple". However, they will "always allow photos of women actively engaged in breastfeeding or showing breasts with post-mastectomy scarring". Plus pictures of paintings, sculptures and other arty nudes are allowed, so that trip to the Louvre doesn't have to go entirely unFacebooked.
Regarding hate speech - attacking people based on race, ethnicity, religion, sexuality, gender, or disability - Facebook says this is a particularly tricky area to police and that it regularly consults with governments, academics and experts on the topic. The standards make it clear that targeting individuals is never allowed; sharing a post that contains hate speech is permitted only if your purpose is "raising awareness or educating others about that hate speech" and you make that intent clear. Recently, Twitter's founders were threatened by supporters of terrorist group Isis for blocking accounts that promote terrorist activities; now Facebook has beefed up its rules on 'Dangerous Organisations', a category that covers terrorist activity and organised crime.
"It's a challenge to maintain one set of standards that meets the needs of a diverse global community," Bickert and Sonderby admit - and there's no doubt this won't be the last community standards set they'll have to write. It's easy to imagine the hate and crime guidelines getting ever more stringent, while nudity rules could become more lax.
© The Belfast Telegraph