Headlines May 2013
Teachers’ union takes aim at Internet abuse (Canada)
Local elementary teachers are joining their counterparts across the province in supporting the International Day Against Homophobia today (Friday, May 17).
17/5/2013- The support goes beyond the classroom; it extends to the Internet as well. The Elementary Teachers Federation of Ontario (ETFO) says it’s time to address the homophobic and transphobic comments that appear on the Internet on a daily basis. The teachers’ organization has a culture of acceptance, said ETFO Halton president Marg Macfarlane. “We want every child to feel welcome and comfortable when they’re attending school and no child should be made to feel unwelcome or left out.” According to ETFO, thousands of homophobic comments are reported daily at www.nohomophobes.com, a website set up by the University of Alberta’s Faculty of Education. The website supports research, policy development and education focused on sexual and gender minorities.
“While teachers are working to create safe, welcoming environments in the classroom, the next frontier for preventing homophobia and transphobia is the Internet and social media,” said ETFO president Sam Hammond, in a news release. “Ontario has laws protecting lesbian, gay, bisexual, and transgender (LGBT) youths and adults from harassment and discrimination, and those laws must extend to the Internet.” Just as homophobic behaviour is not acceptable in classrooms, ETFO says, students need to realize such activity is equally unwelcome on the Internet. “Cyberbullying and harassment of all forms are hurtful and abusive no matter where they are expressed,” Hammond continues in the release.
The ETFO has provided tips for confronting such behaviour on the Internet. The tips come from Montreal organization Fondation Émergence, which started the International Day Against Homophobia. They include:
• Reporting unwanted content to website administrators including Facebook, Twitter, and YouTube
• Reporting homophobic or transphobic content to website hosts
• Encouraging others to do the same when such content appears
ETFO represents 76,000 elementary public school teachers and educational professionals in the province. For the past decade, the teachers’ union has promoted a Positive Space Campaign and educational resources about LGBT topics for teachers.
For information on Fondation Émergence, visit www.homophobiaday.org.
© Inside Halton
Czech Jews document tripling of online anti-Semitism
14/5/2013- The Jewish Community of Prague documented a tripling of online instances of anti-Semitic hate speech last year. The increase, which the community links to a Jewish politician’s presidential bid, among other factors, was documented in an annual report on anti-Semitism published Tuesday. The community documented 82 instances of online hate speech on Czech websites in the last year, compared to only 26 the previous year. According to idnes.cz, a news site, the report attributes the increase to the presidential campaign ahead of elections last January. Jan Fischer, a Jewish politician, was considered a leading candidate but did not make it past the first round. “The presidential elections have revealed a degree of latent anti-Semitism in some groups, but certainly did not indicate anti-Semitism in the majority or mainstream political speech,” the authors of the report wrote. Other causes listed were a strategic shift in extreme-right circles to online activity; escalation of the Israeli-Palestinian conflict; and warm relations between the Czech government and Israel, idnes.cz reported. The authors recorded no physical assault or threats due to anti-Semitism in 2012, but did register six attacks on property and ten instances of harassment, mainly via email. The report further states that the overall prevalence of anti-Semitism is lower in the Czech Republic than in other European countries.
© JTA News.
Czech Interior Ministry: Extremists have won new supporters online
12/5/2013- A Czech Interior Ministry report on extremism in the country last year will be reviewed by the Government this coming Wednesday. According to the report, Czech extremists shifted their activities further onto the internet in 2012. Thanks to that move, both left-wing and right-wing radicals managed to win new supporters, especially among young people. The right-wing scene also reportedly exploited social networking sites. The ministerial material warns that right-wing extremists did their best last year to revive the staging of neo-Nazi concerts. A total of 48 took place in 2012, 30 more than in 2011. "The organization of musical events domestically was revived, but did not succeed in fully returning to pre-2009 levels in terms of its extent or a high participation of famous bands from abroad," the report says.
Right-wing extremists in particular are reportedly maintaining contacts with their fellow-travelers abroad in Germany and Slovakia. "Collaboration with those abroad took the form of functioning as branches of international organizations, gathering in public, holding musical concerts, performances, or private events, and providing certain services," the report says. However, compared to 2011, none of the extremist scenes has reportedly changed much. On the right wing of the political spectrum the Workers' Social Justice Party (Dělnická strana sociální spravedlnosti - DSSS) continues to dominate, while anarchists dominate the left-wing scene. Both camps continue to grapple with a lack of money. Police say the broader membership base of the right-wing radicals comprises about 5,000 people. The militant section is allegedly only about 150 people, roughly one-third of whom can be considered leading personalities and major activists. The situation is allegedly similar with respect to left-wing extremists.
New app for hate crime victims in Hampshire (UK)
8/5/2013- A mobile phone app listing where victims of hate crime can get support has been launched in Hampshire. The app, free to download on iPhone and Android phones, was developed for Hampshire Constabulary and piloted in the western area of the force. It includes detailed information about crimes targeting a victim’s race, sexual orientation, transgender identity, disability, religion, age or refugee or asylum status. It also encourages people to report such crimes. PC Ahmed Sasso, who came up with the idea, said: “The communities we serve across Hampshire and the Isle of Wight are diverse and the app is another tool to help us connect with them. The idea is to provide information in one handy place about hate crimes, including how to report them, how to stay safe and what support is available for victims.” He added: “We treat all reports of a hate crime or hate incident very seriously. They are cowardly and unacceptable crimes which can affect not only the victim but also their family, friends and, in certain cases, a whole section of a community.” The Helping Victims of Hate Crime app will be officially launched on Tuesday, May 14, at the Ageas Bowl, home of Hampshire Cricket Club.
© This is Hampshire
Chelsea's Yossi Benayoun subjected to antisemitic abuse on Twitter (UK)
• Chelsea have called in police to investigate
• Benayoun had just tweeted thanks for birthday wishes
7/5/2013- Chelsea have called in the police to investigate after antisemitic abuse was directed towards the midfielder Yossi Benayoun on Twitter. The Israeli revealed on Tuesday that he was subjected to abuse on the social media site, retweeting a message which contained expletives and referred to him being Jewish, after he had thanked his followers for their birthday greetings. "Some nice people in the world," tweeted Benayoun in response. It is understood the London club have contacted the police to investigate the matter further. Last month Chelsea said they would investigate after Benayoun, who is out of contract at the end of the season, said he was the victim of antisemitic taunts from his own supporters when he came on as a substitute against Liverpool.
© The Guardian
'Geography of Hate' maps racism and homophobia on Twitter
10/5/2013- Following the 2012 US Presidential election, we created a map of tweets that referred to President Obama using a variety of racist slurs. In the wake of that map, we received a number of criticisms - some constructive, others not - about how we were measuring what we determined to be racist sentiments. In that work, we showed that the states with the highest relative amount of racist content referencing President Obama - Mississippi and Alabama - were notable not only for being starkly anti-Obama in their voting patterns, but also for their problematic histories of racism. That is, even a fairly crude and cursory analysis can show how contemporary expressions of racism on social media can be tied to any number of contextual factors which explain their persistence.
The prominence of debates around online bullying and the censorship of hate speech prompted us to examine how social media has become an important conduit for hate speech, and how particular terminology used to degrade a given minority group is expressed geographically. As we’ve documented in a variety of cases, the virtual spaces of social media are intensely tied to particular socio-spatial contexts in the offline world, and as this work shows, the geography of online hate speech is no different. Rather than focusing just on hate directed towards a single individual at a single point in time, we wanted to analyze a broader swath of discriminatory speech in social media, including the usage of racist, homophobic and ableist slurs.
Using DOLLY to search for all geotagged tweets in North America between June 2012 and April 2013, we found 41,306 tweets containing the word ‘nigger’ and 95,123 referencing ‘homo’, among other terms. In order to address one of the earlier criticisms of our map of racism directed at Obama, students at Humboldt State manually read and coded the sentiment of each tweet to determine whether the given word was used in a positive, negative or neutral manner. This allowed us to avoid algorithmic sentiment analysis or natural language processing, since many algorithms would simply have classified a tweet as ‘negative’ even when the word was used in a neutral or positive way. For example, the word ‘dyke’, while often negative when referring to an individual person, was also used in positive ways (e.g. “dykes on bikes #SFPride”). The students were able to discern which uses were negative, neutral or positive, and only those tweets used in an explicitly negative way are included in the map.
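The coding step described above reduces, computationally, to a simple filter over human-labelled tweets. A minimal sketch follows; the tweets and labels are invented for illustration and are not the project's actual data or code:

```python
# Keep only tweets whose human-assigned sentiment label is "negative".
# The tweets and labels below are invented for illustration only.
coded_tweets = [
    {"text": "dykes on bikes #SFPride", "sentiment": "positive"},
    {"text": "(hateful use of a slur)", "sentiment": "negative"},
    {"text": "(neutral mention of a slur)", "sentiment": "neutral"},
]

def negative_only(tweets):
    """Return only tweets coded as explicitly negative uses of a slur."""
    return [t for t in tweets if t["sentiment"] == "negative"]

kept = negative_only(coded_tweets)
print(len(kept))  # 1
```

The point of the manual labels is visible here: a naive keyword match would have flagged all three tweets, while the human-coded filter keeps only the genuinely hateful one.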
Altogether, the students determined over 150,000 geotagged tweets containing a hateful slur to be negative. Hateful tweets were aggregated to the county level and then normalized by the total number of tweets in each county. This highlights places with disproportionately high amounts of a particular hate word relative to all tweeting activity. For example, Orange County, California has the highest absolute number of tweets mentioning many of the slurs, but because of its significant overall Twitter activity, such hateful tweets make up a smaller share and therefore do not stand out on our map. So when viewing the map at a broad scale, don’t read too much into the blue smog of hate covering large areas: even the lower end of the scale indicates the presence of hateful tweeting activity.
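The county-level normalization amounts to a ratio of hateful tweets to all tweets per county. As a hypothetical sketch (the county names and counts are invented, chosen only to show why a high-volume county can rank low despite having the most hateful tweets in absolute terms):

```python
# Hateful-tweet counts and total tweet counts per county
# (all numbers invented for illustration).
hateful_counts = {"County A": 41, "County B": 120, "County C": 7}
total_counts = {"County A": 1500, "County B": 90000, "County C": 400}

def hateful_rate(hateful, total):
    """Share of each county's tweets that were coded as hateful."""
    return {county: hateful.get(county, 0) / total[county] for county in total}

rates = hateful_rate(hateful_counts, total_counts)
# County B has the most hateful tweets in absolute terms, but its huge
# overall activity pushes its normalized rate far below the others.
print(max(rates, key=rates.get))  # County A
```

This mirrors the Orange County observation above: dividing by overall activity prevents heavily tweeting counties from dominating the map simply by volume.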
Even when normalized, many of the slurs included in our analysis display little meaningful spatial distribution. For example, tweets referencing ‘nigger’ are not concentrated in any single place or region in the United States; instead, quite depressingly, there are a number of pockets of concentration that demonstrate heavy usage of the word. In addition to looking at the density of hateful words, we also examined how many unique users were tweeting them. For example, in the Quad Cities (eastern Iowa), 31 unique Twitter users tweeted the word “nigger” in a hateful way a total of 41 times. There are two likely reasons for the higher proportion of such slurs in rural areas: demographic differences and differing social practices with regard to the use of Twitter. We will be testing the clusters of hate speech against the demographic composition of each area in a later phase of this project.
Perhaps the most interesting concentration comes from references to ‘wetback’, a slur meant to degrade Latino immigrants to the US by tying them to ‘illegal’ immigration. Ultimately, this term is used most in different areas of Texas, showing the state’s centrality to debates about immigration in the US. But the areas with significant concentrations aren’t necessarily that close to the border, nor do the other border states that feature prominently in debates about immigration contain significant concentrations.
Ultimately, some of the slurs included in our analysis might not have particularly revealing spatial distributions. But, unfortunately, they show the significant persistence of hatred in the United States and the ways that the open platforms of social media have been adopted and appropriated to allow for these ideas to be propagated. Funding for this map was provided by the University Research and Creative Activities Fellowship at HSU. Geography students Amelia Egle, Miles Ross and Matthew Eiben at Humboldt State University coded tweets and created this map.
The full interactive map is available here
© Floating Sheep
After suicides, lawyers grapple with cyberbullying laws (USA)
A series of high-profile teen suicides in recent years has prompted lawmakers, prosecutors and educators to address the growing problem of cyberbullying.
25/4/2013- At a forum on Wednesday at Rutgers University, legal and policy experts said that state and federal laws are struggling to keep up with the dizzying pace of technological advances and the new kinds of online bullying they make possible. Tyler Clementi was a freshman at Rutgers before taking his own life in 2010 after learning that his roommate, Dharun Ravi, used a Web cam to spy on his sexual tryst with another man. "You can do a lot of damage very quickly with these new technologies," said defense lawyer Rubin Simins at Wednesday's event. Simins represented Molly Wei, a friend of Ravi who was charged with helping him broadcast the video and avoided jail time in exchange for testifying.
Ravi was convicted last year on 15 counts, including invasion of privacy and hate crimes, and sentenced to 30 days in jail. Both sides have appealed. The case sparked a national debate over how aggressively cyberbullying should be prosecuted. In New Jersey, cyberbullying is not a crime. Prosecutors charged Ravi in part under the state's hate crime law, known as bias intimidation. Retired New Jersey state Judge Glenn Berman, who presided over the case, said lawmakers told him after the trial they never envisioned the statute applying to that type of case. The law, like hate crime statutes in most states, increases the potential jail sentence when attached to an underlying crime.
Under the bias intimidation law, a jury can convict a defendant in two ways: by concluding that he or she targeted the victim out of bias, or by finding that the victim believed he or she had been targeted for that reason. The latter provision is "muddled," Berman said, since it ignores the defendant's intent. In January, in an unrelated case, the state's highest court agreed, ruling that provision was unconstitutional unless prosecutors can show the defendant intended to act out of bias. That could invalidate at least one count of conviction against Ravi on appeal, though the jury found he acted with knowledge or purpose for three other hate crime charges.
'No Clear Line'
Brian Sinclair, who heads the computer crimes unit for the Bergen County prosecutor's office in New Jersey, said it can be difficult to determine when cyberbullying rises to the level of a crime. "I don't think a young person always understands what he's doing when he hits send, post, tweet," he said. The uncertainty extends to civil liability as well. After the Clementi suicide, New Jersey passed an anti-bullying law that requires schools to develop protocols to address bullying, including cyberbullying, a move that several other states have followed or considered. That, however, could increase schools' potential liability for bullying incidents if victims can show that their schools did not do enough to stop it, said Elizabeth Jaffe, a professor at John Marshall Law School who has written about bullying.
Last week, the parents of California teen Audrie Pott, who they said killed herself after classmates shared a photo of her being sexually assaulted, filed a notice that they intend to sue school officials for mishandling their daughter's complaints. The officials claimed she never reported the bullying before or after the alleged assault. In 2011 California passed new laws that require schools to have policies for addressing bullying complaints, after a gay teenage boy who was bullied committed suicide. Emily Bazelon, a journalist who recently published a book on bullying, "Sticks and Stones," said schools across the country are grappling with whether they have the authority to discipline students for conduct that occurs outside of school, and federal courts are divided on the issue. "Eventually, the Supreme Court is going to have to take one of these cases," she said. "There is no clear line."
The use of the Internet by far-right extremists
24/4/2013- Extremists have used the internet to build membership, advance their credibility and mobilise for direct action. The last 3-5 years have seen a growth in populist movements across Europe. In contrast to previous years, these movements are characterised by:
a) Greater co-ordination across national borders;
b) A focus on values and xenophobia rather than race;
c) Emphasis on local activism and street demonstrations in addition to traditional electoral politics.
This is of concern to policymakers at both the European and national level, and increasingly so following the terrorist attacks in Oslo in 2011. These groups, like the English Defence League (EDL) or Greece’s Golden Dawn, style themselves as pseudo-paramilitary organisations and use the internet and social networking sites to organise demonstrations and recruit new members. Rights groups have warned of an explosion in racist violence over the last year. They say the severity of the attacks and the tools used have increased, from simple fists to assaults with metal bars, bats and knives. Today’s typical right-wingers are no longer the violent skinheads of the 1990s, but middle-class youth organising through social media. According to a study conducted by the British think-tank Demos in 2011, young people with extreme-right and anti-Islam views use the internet extensively to spread their views and promote their cause.
In Greece, reports that the interior ministry asked local municipalities to provide data about the number of immigrant children in public nurseries – allegedly in response to a request from neo-Nazi Golden Dawn MP Ilias Panayiotaros – raised concerns that such a list could be published on the internet for access by the group’s members. It is possible to find online that Golden Dawn members have publicly stated that they would “carry out raids on hospitals and kindergartens and throw immigrants and their children out on the street so that Greeks can take their place”. The same group has stated that it wants to plant land mines on the country’s borders to protect against illegal immigrants. Several Golden Dawn members are awaiting trial for crimes ranging from armed robbery to severe bodily harm.
Victims of Golden Dawn attacks described the attackers as acting in an organised manner and in groups, wearing military trousers and helmets or with their faces covered. The majority of attacks occur after sunset. “Patrols” by motorcyclists dressed in black are described as a common practice. Victims also described the use of weapons such as clubs, crowbars, folding batons, chains, brass knuckles, knives and broken bottles during the attacks.
Far-right groups from Scandinavia, the United Kingdom, Germany and Eastern Europe gathered in September in the Danish city of Aarhus for what they said was a rally to make their governments aware of the threat of Islamic extremism. Greece’s neo-Nazi Golden Dawn party has set up an Italian branch in the north-eastern city of Trieste, according to local reports. Alba Dorata (Golden Dawn Italy) was founded by Alessandro Gardossi, formerly a member of Forza Nuova, a neo-fascist organisation that has ties with Greece’s far-right party. He plans to keep in touch with Italy’s far-right movements and parties, such as CasaPound, nicknamed the Fascists of the third millennium. Golden Dawn recently opened an office in New York, announcing its presence with a sleek website depicting a stylized swastika against a darkened Manhattan skyline. It has also established outposts in Australia and Canada, where Greeks have been emigrating by the thousands to escape the crisis in their homeland.
Internet, Social Media & Propaganda
Members of Golden Dawn conduct their propaganda through deeds, in exactly the same way as the Muslim Brotherhood in Egypt or Hezbollah in Lebanon. “We are providing food produced in Greece for Greek citizens only,” Ilias Kassidiaris, spokesman of Golden Dawn, told a local newspaper. It is also reported that he, along with other members of the group, checked people’s identification before handing out carrier bags of food and other basic necessities. About 150 Golden Dawn members reportedly rode through neighbourhoods on motorcycles handing out leaflets. The Nazi symbolism is explicit: holding torches, members shouted “blood, honour, Golden Dawn” – a direct translation of “Blut und Ehre”, the motto once carried by the Nazi SA – and sang the group’s official hymn “Raise the flags high”, again a direct translation of the Nazi storm troopers’ hymn “Die Fahne hoch”. Videos released on the internet showed three Golden Dawn deputies and dozens of supporters checking the documentation of foreign-born traders, then destroying their market stalls and merchandise, in the central Greek towns of Messolongi and Rafina on 8 September 2012. The video was later published on the Golden Dawn website with a warning that “illegal street trade by illegal immigrants will not be tolerated”.
Brunon K, a Polish citizen obsessed with Anders Behring Breivik, the man responsible for the Norwegian terrorist attacks, was arrested in a raid by members of Poland’s security services in Krakow on 9 November 2012. Along with the suspect, agents seized four tons of explosives, detonators, a pistol, hundreds of rounds of ammunition, a bullet-proof jacket and false registration plates. He had attracted police attention by praising Breivik on internet forums. Breivik’s manifesto has now been read by an untold number of people thanks to the internet and social networks. Breivik cites the far-right English Defence League, which protests against what it sees as “uncontrolled Muslim immigration” to the United Kingdom, as an inspiration. The group began with just 50 members two years ago, but has now expanded to almost 100,000 members, growth it credits to its online presence on social networks.
The internet now plays a part in most, if not all, cases of violent radicalisation and is a more significant recruiting ground than prisons, universities or places of worship. Extremists are no longer isolated from others who share their beliefs. Now, they can communicate with thousands of their compatriots with the click of a mouse. The Internet has provided the far-right fringe with formerly inconceivable opportunities. Online, racists, anti-Semites, and anti-government extremists can reach a much larger audience than ever before and can more easily portray themselves as legitimate. When uninformed or easily influenced people come across hate propaganda, they can fall prey to its deceptive reasoning and adopt hateful beliefs themselves, sometimes going so far as to act on what they have read.
The internet supports new forms of far right mobilisation. A clear example is the Immortal group in Germany, which emerged in 2011. Organised around Twitter and other social media, it stages unregistered rallies at night, at which its activists wear white masks, carry torches through urban areas and chant extremist slogans. Shortly after each gathering, a professionally produced video appears on YouTube. Intended to demonstrate the group’s power and support, the imagery harks back to the torchlight processions of inter-war Nazism.
Activists have embraced the internet to such an extent that it is now virtually impossible to track all the bloggers, Twitter accounts and Facebook pages that have become indispensable tools of communication for them. These virtual communities offer their members a set of shared values, norms and meanings, and a sense of history. In February 2012, Wilders’s party in the Netherlands launched a website targeting Polish immigrants, who had come by the thousands as the Netherlands opened its door to more workers from poorer parts of the European Union in the mid-2000s. The site invited Dutch citizens to report Eastern Europeans for doing anything from “taking your parking spaces” to “taking your jobs.”
The analysis recommends:
• Developing a strategic plan for the prevention of and response to racist attacks, in cooperation with specialised global and European organisations.
• Creating a special force within the European Union Member States that would liaise with law enforcement agencies.
• Identifying special programmes for the training of police officers, within the EU framework and in cooperation with national institutions.
• Cooperating with experts who monitor the activities of extremist groups so as to ensure a more comprehensive response.
• Requiring internet service providers to be as effective at removing material that promotes violent extremism as they are at removing content that is sexual or breaches copyright.
© OSINFO Analysis
Cissé And Girlfriend Racially Abused Online (UK)
The footballer and his girlfriend Rachelle Graham were subjected to abuse after raising money for his charity
22/4/2013- A Premier League footballer and his beauty queen girlfriend have been targeted by racists – because he is black and she is white. Newcastle United striker Papiss Cisse, 27, and 22-year-old Rachelle Graham found themselves subjected to vile abuse after raising money for his charity. Rachelle, the current Miss Newcastle, took part in a skydive from 10,000ft to help fund an ambulance to help people in Cisse’s native Senegal. The event was attacked on a website set up by bigots who are against mixed-race relationships. A bogus Twitter account was then set up in Rachelle’s name, and friends and wives of other Toon stars, including midfielder Yohan Cabaye, also started to receive racist messages.
Now detectives are investigating the abuse as a hate crime after being called in by Rachelle’s shocked mum Val, 46. The Mirror has also passed on some of the disturbing messages to Durham police. Cafe owner Rachelle said: “At first the messages came from a US website which targets white people who are going out with black people. The person behind that may also have set up the fake Twitter account in my name. It was made to look as much like mine as possible, using my photos and virtually the same address. “The first I knew of it was when I started to get messages from friends responding to tweets I had not sent. This person was sending out abuse and saying I was no longer with Papiss, and would use racist terms. “I am totally sickened by it. I cannot believe that someone would do this.”
The cloned account, @grahamracheLLe, has now been suspended. The real one, @GrahamRachelle, has 1,500 followers. The abuse began last month after Rachelle, of Edmundbyers, Co Durham, helped raise £900 for Cisse’s charity, Friends of Sedhiou. Rachelle added: “They put up pictures of monkeys with knives, and would change newspaper stories on him, using terms like ‘imported from Senegal’.” “Papiss has seen this kind of racism before, and said we should ignore it. But I think it’s not on and we have to do something to stop it.” A police spokesman said: “The website has been reported and we are investigating.”
© The Mirror
Anti-Semitic attacks ‘diminish dignity’ (South Africa)
16/4/2013- The South African Jewish Board of Deputies has filed a hate speech complaint in the Equality Court against Snowy Smith, the Durban man they accuse of bombarding their members with e-mails, CDs and DVDs referring to “Zionist Jews” as the enemy and orchestrators of wars. The board is calling for an unconditional apology from Smith and for him to be restrained from disseminating further alleged “hate speech” against Jewish people. He has threatened to sue the board for R10 million for defamation. Smith, who described himself in his e-mail signature as a senior complaints investigator for Fair Civil Law, a company that investigates and exposes conspiracy theories and propaganda lies, said yesterday that he had not received the court documents and denied being anti-Semitic.
The board’s national spokeswoman, Mary Kluk, said most of the e-mail recipients lived in Cape Town and that the messages were unsolicited. “Our members have been bombarded with these e-mails, and the CDs and DVDs containing this hate speech also made their way, in bulk, into the public domain,” she said. “At the moment the court matter is up in the air because it has been a battle to find Snowy Smith to serve him with the summons to appear in the Cape Town Equality Court.”
The board is an umbrella organisation of the South African Jewish community and its aim is to protect the civil liberties of Jewish people and to combat racism, including anti-Semitism. Its lawyer, Tzvi Brivik of Malcolm Lyons & Brivik Incorporated in Cape Town, said the court’s registrar had served the complaint and summons via registered mail to Smith, but it was unclear whether he had received it. Brivik said they were working with the court registrar to try to get a fixed address for Smith. The matter had been set down for the middle of next month when the court would consider whether it should be heard in Durban or Cape Town. In its court papers, the board said Smith was well known to it for his propagation of anti-Semitism and expression of “anti-Jewish” hate speech.
The complaint refers to e-mails sent from October 2010 to May last year. The board contends that the e-mails fell within the exceptions to the right to freedom of expression, as they amounted to the “advocacy of hatred that is based on race, ethnicity, gender or religion which constitutes incitement to cause harm”. The targets, the board alleges, were people of the Jewish faith, a small minority in South Africa. The e-mail recipients included a Durban attorney, a Cape Town news editor, a Cape Town teacher and a prominent South African businessman and former high court judge, all of whom are Jewish.
The e-mail content included:
* “The enemy is Zionist Jew, freemasons, communism, fascist, military dictatorship new world order. Most Zionist Jews are freemasons.”
* “The Zionist Jew NWO (new world order) orchestrated and systematically planned the downfall of every White Owned, White Run Country in Africa, including South Africa using communism.”
* “The Zionist Jew NWO orchestrated and systematically planned the last World War and almost every war and or major revolution in history.”
* “There is only one group of terrorists in the world today and they are Zionist Chabad Freemasons, Illuminati Jews and the Billionaire Wall Street Bankers. We know exactly where they are.”
The court papers referred to all of the e-mail contents – “every idea and statement expressed” – as being an example of hate speech. The board said Smith’s statements diminished Jewish people’s self-worth and dignity, both individually and as a group. Other effects of the e-mails noted in court papers included feelings of being “implicitly threatened” and “psychologically harmed”. The board also argued that a strong sense of “ill will” towards Jewish people would be instilled in members of the public exposed to those e-mails. However, Smith has denied being anti-Semitic. “I am pro-Semitic. I love and support the Palestinian people and the Arabs who are ‘Semites’,” he said.
© IOL News
Jewish group sues Twitter in France again over anti-Semitism
15/4/2013- A French Jewish group that sued Twitter for hosting anti-Semitic content lodged a fresh complaint against the social networking service. The latest complaint by the Union of Jewish Students of France, or UEJF, was filed on April 12 with the Paris Public Prosecutor’s office against Twitter CEO Dick Costolo. UEJF and another group, J’ACCUSE, said in the complaint that Costolo was “responsible for racial defamation and publicly inciting to discrimination, hate or violence toward Jews.” The complaint concerns tweets that call for killing Jews and praising the Holocaust. UEJF last month sued Twitter for $50 million after the California-based company failed to honor a ruling in January by a French judge ordering it to divulge within 14 days details of users who posted anti-Semitic statements.
France and other European countries have laws against hate speech that are considerably stricter than those in the United States. In its ruling, the Paris court also ordered Twitter to set up a system for flagging and removing such messages, but UEJF said Twitter has not complied. Additionally, UEJF accused Twitter of lying when it reportedly announced last October that it would remove similar tweets. The tweets are still available to users who do not self-identify as being French, UEJF said, even though Le Nouvel Observateur reported in October that Twitter said it had removed the tweets, which had spurred public condemnation. The phrase #UnBonJuif (meaning “AGoodJew”) became the third most popular hashtag on French Twitter due to what Le Monde termed “a competition of anti-Semitic jokes” that evolved around it. Twitter did not respond to JTA's request for comment on the latest UEJF complaint.
© JTA News
ADC continues the battle against online hate (Australia)
The B’nai B’rith Anti-Defamation Commission has commented on fresh media allegations of growing racism in Australia.
10/4/2013- Chairman Dr Dvir Abramovich said: “In recent weeks there have been numerous media pronouncements about the prevalence of racism in Australian society. Prominent amongst these were articles by Fairfax columnists Waleed Aly and Tim Soutphommasane.” While levels of racism, both actual and comparative, are debatable, it is clear that societies in flux such as Australia are vulnerable to the stress of change, a manifestation of which may be racism. There have been a number of recent examples of racist assaults that have received wide publicity, including hate-filled rants on public transport in Melbourne and Sydney. And while these were extremely distressing for the victims, what was perhaps even more so was the passivity of other passengers.
I recently have spoken out against the use of social media, Facebook in particular, to propagate hatred of those who are not like ‘us’. Pernicious though the use of media to defame may be, this pales in comparison to the face to face abuse of others. This invasion of one’s personal space by hateful invective is both very frightening and humiliating.
According to Tim Soutphommasane (Racism is like a cockroach of civilised society, The Age 8/4), racism’s real harm is that it allows people “to believe they are empowered to harass, belittle and intimidate others”. Mr Soutphommasane must have grown up in a very lucky country to hold this remarkably benign view of its effects. Quite simply the truly evil effect of racism is that it allows and excuses the genocide of those who look or think differently to ‘us’. Hence I would differ with Mr Soutphommasane’s argument that it is not obligatory “to put ourselves in harm’s way in solidarity with a fellow citizen or person in need”. When the stakes are as high as saving human lives, there are no excuses to be a bystander.
Four race-crime convictions for neo-Nazi website (Italy)
Stormfront targeted figures who spoke up for immigrants
8/4/2013- Four men were convicted Monday of inciting race hatred on the Italian website of the neo-Nazi group Stormfront. The four, aged 23 to 42 and from various towns across Italy, were sentenced to terms ranging from 30 months to three years for "promoting and directing a group whose purpose was the instigation to ethnic, religious and racial discrimination and violence". A Rome judge found the four guilty of targeting "Jews and immigrants, advocating the supremacy of the white race and instigating racism and Holocaust-denial". The four, who were placed under house arrest Monday, were first arrested November 16 after police shut down the website, which had regularly posted anti-Semitic and white supremacist propaganda. The arrests were made in Milan, Frosinone and Pescara while 17 searches were carried out across the country, most in the north east. 'Nazi-Fascist' propaganda and weapons were found.
In December 2011 Rome prosecutors launched a probe into Stormfront's blacklisting of religious figures, politicians, journalists and judges. The Italian branch of Stormfront is part of an international body founded by the former head of the Ku Klux Klan, Don Black. The blacklist included: Turin Archbishop Cesare Nosiglia; Riccardo Pacifici, the President of the Jewish Community in Rome; Adel Smith, the President of the Muslim Union of Italy; the mayor of Padua, Flavio Zanonato; several members of the judiciary; and journalists Gad Lerner, a Jew, and veteran TV talk-show host Maurizio Costanzo. According to media reports, those on the list were targeted because of their support for immigrants. Also listed were then House Speaker Gianfranco Fini and then Minister for International Cooperation and Integration Andrea Riccardi, who have both spoken out about citizenship rights for immigrant children. They were on a list of so-called 'Italian delinquents' drawn up by the neo-Nazi group.
© Gazzetta Del Sud
Fascists target Sunderland Muslims on Facebook (UK)
8/4/2013- Anti-fascists claim Far Right extremists are establishing roots in Sunderland and spreading racial hatred across the city via Facebook. Activists from Hope not Hate claim a Wearsider using the alias “Angel United Patriots” is playing a central role in organising Far Right demonstrations against Muslims living in the city. The organisation also claims the user behind “Angel” was among crowds of English Defence League (EDL) supporters chanting anti-Muslim slogans at last month’s demo against proposals for a new mosque on St Mark’s Road in Millfield. Online postings seen by the Echo suggest “Angel” is a primary mover in getting Far Right activities “up and running in Sunderland”. Comments have also been left in which Muslims in the city are described as “Muzrats” and “Muzzies”. One post says: “Just had a Muzrat in shop asking why I don’t have any clothes for her”.
On their websites both Hope not Hate and the Tyne and Wear Anti Fascist Association (TWAFA) have condemned the racist comments, branding them “disgusting and derogatory”. Concerns about the establishment of Far Right extremism in the city have grown in recent months following a series of protests over the siting of the new mosque. Three people were arrested during last month’s protest, which saw up to 80 EDL demonstrators clash with anti-fascist organisations. A spokesman for the Tyne and Wear Anti-Fascist Association said: “We know there is a Sunderland EDL running in the city, but we know other groups are also getting involved with them. “This is a network that is starting to establish itself and the issue with the mosque has given them an excuse to voice their views.”
Hope not Hate says it has been monitoring the activities of extremists in Sunderland over recent months and has become increasingly concerned with the postings made by “Angel”. The Echo understands Far Right groups have also tried to distribute thousands of leaflets across the city in a bid to recruit more members.
© The Sunderland Echo
UK Neo-Nazi Darren Clifft Arrested after KKK-Style Mock Hanging at Rally
A notorious British neo-Nazi has been arrested on suspicion of inciting racial hatred after he was accused of posting racist and inflammatory material on the internet.
7/4/2013- Darren Clifft, 23, from Walsall, West Midlands is believed to have been one of the ringleaders behind last month's far-right rally in Swansea, when around 50 white supremacists were confronted by a crowd of around 500 anti-racism campaigners. Clifft, who also goes by the name Daz Christopher, is known to police having previously voiced support for the Norwegian mass-murderer Anders Breivik, who killed 77 people in a bombing and shooting spree in Norway in 2011. Breivik was jailed for 21 years last August, but used his trial as a platform to promote his extreme anti-Islamic, anti-feminist and anti-Marxist views. Clifft, a kickboxing fanatic who has coached youngsters in the sport, has threatened similar violence in the UK, and set up an online petition to free Breivik. "At least Breivik achieved something with his life," wrote Clifft. "He killed as many left-wing liberal loonies as he could. He is truly an inspiration. He sacrificed his life so that Europe might be free once again from the clutches of Islam."
Pictures from the Swansea march which were later posted on the internet showed a neo-Nazi rabble dressed in the white hooded robes of the Ku Klux Klan, carrying out a mock hanging of a black cloth doll. The event, organised by the National Front, was followed by a bizarre rock concert organised by Blood and Honour, a Nazi-inspired group which takes its name from the motto of the Hitler Youth.
'A menace to society and a very real danger'
Other known white extremists including Shane Calvert, of Blackburn, and Michael Kearns, from Liverpool, along with locals David and Bryan Powell and Luke Pippen, were also at the event, according to anti-racism campaigners Hope Not Hate. Police say Clifft has now been bailed pending further inquiries by officers from the West Midlands Counter Terrorism Unit. Matt Collins, of Hope Not Hate, said: "Not only is Darren Clifft a menace to society, he presents a very real danger to any young people he still works with. "It's time the police took action against these kind of nutters."
© The International Business Times - UK
Facebook Removes Anti-Semitic Page With Graphic Holocaust Images (Australia)
Photo of Mass Jewish Grave Is Labeled 'Lazy Jews'
7/4/2013- Facebook has removed a page featuring anti-Jewish content, including a photo of murdered Jews in a mass grave at Bergen-Belsen with the headline “Lazy Jews.” In response to a formal complaint by the Executive Council of Australian Jewry, Mia Garlick, Facebook’s Sydney-based manager of communications and policy, confirmed Saturday that the Jewish memes page was no longer available. Peter Wertheim, executive director of the Executive Council of Australian Jewry, welcomed the removal and said that Facebook’s decision about another 15 such pages about which the council has complained is pending. “The Jewish memes page not only represents a slur on all Jews, as Jews, but also subtly seeks to justify and therefore incite violence against them,” Wertheim wrote. “Many of the posted comments lay bare the hateful purpose of this page.” Wertheim added that the Facebook page violates both Australian law, through the Racial Discrimination Act, and Facebook’s Terms of Service against hate speech.
© The Forward
Man who called Jews ‘scum’ back in jail (UK)
A Welsh man who was jailed for encouraging racial hatred is back in prison after being given a 12-month sentence for harassment.
5/4/2013- Trevor Hannington, who was associated with the far-right Aryan Strike Force, was jailed for two years under the Terrorism Act in 2010. The neo-Nazi had described Jews as “scum” that should be “destroyed”. At the time, Liverpool Crown Court heard that he used a website to encourage others to: "Kill the Jew, Kill the Jew, Burn down a synagogue today ... Burn the scum." Mr Hannington admitted owning the Anarchist's Cookbook and The Terrorist Encyclopedia, and admitted publishing instructions on how to make a flame thrower out of a water pistol. According to a report on the Mail Online, Lindsay Seagrim-Trinder, a potential witness in the original case, had told the police about Mr Hannington’s far-right views. He called her a “grass” and after his release from prison published her details online as part of a hate campaign. On Wednesday, Mr Hannington pleaded guilty to putting a person in fear of violence by harassment at Merthyr Crown Court.
© The Jewish Chronicle
Facebook's Online Speech Rules Keep Users On A Tight Leash
3/4/2013- Corporations may have more control over online speech today than the courts. Executives determine which videos, pictures and comments are permitted and what art is allowed. Their rules govern billions of posts across the globe each day. And those policies differ dramatically across Silicon Valley's big social platforms. Twitter calls itself the free speech wing of the free speech party and models its approach on the U.S. Constitution. Facebook's rules, however, more closely resemble workplace policies: the kinds of rules that govern what you can say to colleagues at lunch. But unlike an employer's rules, Facebook's are applied on a global scale. Judd Hoffman, Facebook's global policy manager, is a trim 40-something. Well-spoken, thoughtful and a trained lawyer, he doesn't give off the vibe of a global power broker — but in a very real sense, he is one. "Our job is to manage the rules that determine what content is unacceptable on Facebook and also, obviously, what is acceptable," Hoffman says. His team determines what more than 1 billion people and businesses can and can't say and do on Facebook.
To give you a sense of the scale of this job, consider this: There are more than 300 million photos, millions of video links and 2.5 billion messages of one kind or another uploaded to Facebook every day. Hoffman's team has to police it all. And Facebook bans many kinds of content that, in public, would be protected by the First Amendment — things like nudity, hate speech, bullying and pornography. And the company doesn't use algorithms to seek out bare flesh or other violations, Hoffman says. Instead, Facebook relies on its billion-plus members to flag content that violates its community standards. It then uses algorithms to sort the different kinds of possible violations into digital piles. Then, hundreds of real people — employees around the globe — sift through these piles of status updates, photos and videos to decide which speech and images to allow and which to block.
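The flag-then-triage workflow described above — users report content, an algorithm sorts the reports into per-category "piles", and human reviewers work through each pile — can be sketched in a few lines. The category names and report fields here are illustrative only, not Facebook's actual taxonomy or API:

```python
from collections import defaultdict

# Illustrative report categories; not Facebook's real taxonomy.
CATEGORIES = ("nudity", "hate_speech", "bullying", "spam", "other")

def triage(reports):
    """Sort incoming flag reports into per-category queues ("piles")
    that human reviewers would then work through."""
    piles = defaultdict(list)
    for report in reports:
        category = report.get("category")
        if category not in CATEGORIES:
            category = "other"  # unrecognised flags fall into a catch-all pile
        piles[category].append(report["post_id"])
    return dict(piles)

# Hypothetical user-submitted flags on four posts
reports = [
    {"post_id": 101, "category": "hate_speech"},
    {"post_id": 102, "category": "nudity"},
    {"post_id": 103, "category": "hate_speech"},
    {"post_id": 104, "category": "unknown_tag"},
]
piles = triage(reports)
```

The point of the sketch is the division of labour: the automated step only groups and routes reports, while the judgment call on each item is left to a human reviewer.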
More Power 'Than Any King Or President'
Jeffrey Rosen, a law professor at George Washington University, says social media companies have a tremendous role in dictating online speech. Facebook, Google and Twitter "have more power over who can speak and what can be said all across the globe than any king or president or Supreme Court justice," he says. "But unlike presidents, Facebook is not constrained by the Constitution," Rosen adds. "The First Amendment only binds the government — not private corporations." Generally, anonymous speech is forbidden on Facebook. When you post a comment or picture, upload a video or click a "like" button, there's social accountability; your friends know what you are up to. But there is one place where it is possible to speak on Facebook without attaching your name to what you say: Corporations have these pages, as do community groups.
If you are a fan of cats or off-color jokes, it's possible to create a Facebook community page dedicated to your passion — and you can do it without revealing who you are. This created a problem for Hoffman, Facebook's global policy manager. Lots of pages started cropping up. "You know, fat jokes and Polish jokes, or all those things," Hoffman says. "Because they are humor, they don't necessarily violate our terms — but can be hurtful in a lot of contexts." Facebook's ban on hate speech has its limits, and on these community pages, which are more or less anonymous, some people took advantage. "There was a very distasteful meme in the form of an illustrated joke about Anne Frank," says Christopher Wolf, chairman of the National Civil Rights committee for the Anti-Defamation League. "To any Holocaust survivor or anyone concerned with anti-Semitism, it was quite offensive." There were also pages dedicated to mocking the disabled, gay jokes, racial slurs and content that tiptoed up to the line when it came to promoting violence.
At Facebook, Hoffman realized community pages were becoming a haven for Internet trolls, so his team decided to shine a bright light under a bridge. When a page was flagged as problematic, Facebook began requiring that the administrators of these pages identify themselves publicly. "In the vast majority — vast, vast majority of cases where people are asked to do that — they chose not to, because it requires them to take responsibility for that kind of content," Hoffman says. When administrators choose not to comply, the pages come down. Wolf applauds that move. "I think today, Facebook takes a very sophisticated and sensitive approach to handling the issue of hate speech," he says.
A Chilling Effect?
But law professor Rosen has reservations. He says Facebook isn't merely banning hate speech. What it's really doing, he says, is judging what is and isn't offensive — and all of this is based on community norms. If Facebook had existed in the 1970s, Rosen says, rules like these could have easily made organizing around, say, gay rights difficult or impossible. He says by definition, transgressive movements, at their founding, are going to offend people. "Those sort of protests often express themselves with jokes, with bad taste, and they depend on anonymity," Rosen says. "It's impossible to imagine the gay-rights movement without anonymity." So while Rosen acknowledges that Facebook has every right to determine for itself what speech to allow and what to ban, he hopes that the company will preserve at least the possibility for anonymous actors to say politically controversial — even occasionally offensive — things online.
Spreading Racism via Facebook
Heavy Facebook users are more likely than those who log on occasionally to react positively to racist remarks.
By Tom Jacobs
29/3/2013- Is Facebook a particularly powerful medium to spread racist messages? That’s the disturbing implication of a newly published study. “Frequent users are particularly disposed to be influenced by negative racial messages,” psychologists Shannon Rauch and Kimberley Schanz write in the journal Computers in Human Behavior. They argue these heavy users log onto the site in search of social inclusion rather than information—and as such, they’re prone to express agreement with the material they see without thinking about it too deeply. This combination of “a need to connect and an ethos of shallow processing” creates an atmosphere conducive to the spread of racist thoughts. Rauch and Schanz describe a study featuring 623 Internet users, nearly 95 percent of whom had a current Facebook account. They were asked how often they checked the site, reporting their typical usage on an eight-point scale from “less than once a week” to “20 or more times per day.” They then read one of three versions of a Facebook Notes Page, which was purportedly written by a 26-year-old white male named Jack Brown.
One version contained what the researchers describe as a “superiority message,” a post in which Jack “contrasted the behaviors of black and white individuals, only to find consistent superiority of the whites.” The second contained a “victim message,” a post in which Jack argues that “whites are the most oppressed racial group in America.” The third contained an “egalitarian message,” a post in which Jack gives examples of anti-black racism he has witnessed. The study participants were asked, among other things, “how much they agreed with the message,” “how accurate they found it,” “how much they liked the writer,” and how likely they were to either share the post with others or argue against it. The researchers found more-frequent Facebook users did not differ from the others in their reaction to the egalitarian message. However, they “were more positive toward the messages with racist content—particularly the superiority message,” they write.
Why would this be? “Frequent Facebook users are likely susceptible to negative persuasive messages because they engage in less critical processing, either because of their online experiences or personality traits,” Rauch and Schanz write. “Agreement and positive attitudes are driven by a need to belong and connect with others.” They note that, compared to those who use the site primarily for entertainment or “connecting with others,” the minority of Facebook users who report they use the site to find information and express their opinions were more likely to reject the racist messages. This group “appeared to discriminate between messages” to a far greater degree than the others. “This is a sobering finding, given that Facebook use has become increasingly commonplace, and … information-seeking is not a primary motivation of most Facebook users,” Rauch and Schanz conclude.
“Facebook clearly has diverse content,” they note, “which can include persuasive messages of a sort that warrant critical thinking and some depth of processing.” But critical thinking is often absent when people are motivated by the desire to be accepted, or to be entertained. As a result, this study suggests, some pretty disturbing stuff is being received with uncritical acceptance.
© Pacific Standard Magazine
Hacker in huge Web attack makes anti-Jewish statements
A man claiming to represent the hackers behind one of the biggest attacks in Internet history made anti-Jewish statements.
29/3/2013- Sven Olaf Kamphuis, who claims to be a spokesman for the group that has slowed down Web access in retaliation for Internet providers that refuse to prevent their clients from spamming, made anti-Semitic statements on Facebook, the Daily Beast reported. "There are a certain group of Jews known as the Zionists that think they are better than other people and this is not a problem with all Jews, this is just a problem with certain Jews who think the others are like the goyim," Kamphuis said in an interview. "I think Steve Linford is like that.” Linford is the founder of Spamhaus, a Geneva-based anti-spam company. According to Kamphuis, Spamhaus placed an Internet Service Provider that he owns on a blacklist, effectively denying it access to the Internet. In retaliation, Kamphuis says he and others formed an opposition group, Stophaus, which earlier this month launched a massive attack that flooded servers with data, impeding Internet service for users around the world. It is said to have been the largest such attack in history. Kamphuis has a history of making claims against Israelis. In 2010, after a German court issued an injunction against his company for piracy, he claimed the Mossad spy agency tried to blow up his car.
© JTA News
Yeah, We Broke the Internet: The Inside Story of the Biggest Attack Ever
Sven Olaf Kamphuis says his group was behind this week’s massive Internet attack. He also says it was completely justified—and kind of blamed the Jews. Eli Lake on the Web’s weirdest war.
28/3/2013- What is becoming clear is that the attack is an outgrowth of a little-known, but highly explosive war between two factions: on one side are the Internet service providers (ISPs) and Web hosts that don’t ask their clients too many questions about whether they are hosting spam and other kinds of malicious code; on the other are groups that try to name and shame the spammers, and stop them from infiltrating your inbox—or worse, your bank’s servers. This side is engaged in a massive game of virtual whack-a-mole, only one with no end in sight.
In this latest retaliatory attack, the spammers got the better of their opponents, shutting down servers and slowing down the entire Internet. One man so far has come forward, claiming to be the spokesman for the attackers—a man named Sven Olaf Kamphuis. A so-called Internet activist, Kamphuis disdains government regulation of the Internet and, at least according to his Facebook page, gays and Jews. In an interview with The Daily Beast, Kamphuis said he owns an ISP that was put on a blacklist by the Geneva-based anti-spam company Spamhaus. Companies on the blacklist are blocked by email providers and other Internet service companies, which means they’re essentially kicked off the Internet.
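Spamhaus distributes its blocklist as a DNS-based blackhole list (DNSBL): a mail server checks a connecting IP address by reversing its octets and querying the result as a hostname under the list's DNS zone; if the name resolves, the address is listed. A minimal sketch of that check — `zen.spamhaus.org` is Spamhaus's public query zone, and the helper names are this sketch's own:

```python
import socket

def dnsbl_query_name(ip: str, zone: str = "zen.spamhaus.org") -> str:
    """Build the DNSBL lookup hostname for an IPv4 address:
    the octets are reversed and appended to the list's zone."""
    octets = ip.split(".")
    if len(octets) != 4:
        raise ValueError("expected a dotted-quad IPv4 address")
    return ".".join(reversed(octets)) + "." + zone

def is_listed(ip: str, zone: str = "zen.spamhaus.org") -> bool:
    """Return True if the address resolves in the blocklist zone.
    A successful A-record lookup means the IP is listed."""
    try:
        socket.gethostbyname(dnsbl_query_name(ip, zone))
        return True
    except socket.gaierror:
        return False

# The query name for 192.0.2.1 would be "1.2.0.192.zen.spamhaus.org"
```

Because the check is just a DNS lookup, any mail provider can consult the list cheaply on every incoming connection — which is also why being added to it amounts, in practice, to being cut off.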
So Kamphuis and others on the blacklist formed an opposition group, Stophaus, and earlier this month, they launched the most powerful “distributed denial of service” (DDoS) attack in the history of the Internet. DDoS attacks flood a server with data—in this case, 300 billion bits of data per second—at a rate it can’t possibly handle, thereby shutting it down. Stophaus’s onslaught overwhelmed not just Spamhaus’s servers, but the rest of the Internet, too. Thus, Netflix users around the world were suddenly wondering why they couldn’t stream You’ve Got Mail. “There are a lot of people who are really pissed off about this,” Kamphuis said of Spamhaus. “And we are the first to show some balls and do something about it.”
Kamphuis said he himself had nothing to do with DDoS attacks. “I am a spokesman for Stophaus,” he said. “But being in the Internet industry I cannot have anything to do with these attacks.” Kamphuis said his group decided to stop the attacks on Tuesday, but said there are other hackers, and possibly even governments, who would like to continue the assault. On his Facebook page, Kamphuis is adamant about his hatred of Spamhaus, posting on Wednesday that the company “took down members of the stophaus.com group—first—and without any court verdict, just by blackmail of suppliers and jew lies.” He went on to call for an end to SMTP, the Internet protocol for sending and receiving emails, saying it gives “fags an excuse to nag.” “There are a certain group of Jews known as the Zionists that think they are better than other people.”
In an interview, Kamphuis, who says he is from Amsterdam and lives in Barcelona, said the comment about “jew lies” was referring to Steve Linford, the founder of Spamhaus. “This is a reflection on Steve Linford,” Kamphuis said. “He is always nagging people. There are a certain group of Jews known as the Zionists that think they are better than other people and this is not a problem with all Jews, this is just a problem with certain Jews who think the others are like the goyim. I think Steve Linford is like that.” Kamphuis said his politics are largely libertarian. “If I was in the United States, I would be Republican without all the Christian blah blah around it,” he said. “I want a minimal state with not much taxes, that all belongs to companies. I don’t like Israel though.”
Erik Bais, the owner of A2B-Internet, an ISP based in the Netherlands, described Kamphuis as a brilliant programmer who doesn’t care what others think of him. “Jewish people are not high on his favorite list,” he said. Kamphuis himself made his reputation when in 2010 he attempted to host Pirate Bay, a website that was targeted by the Swedish authorities for copyright infringement. A German court placed an injunction against Kamphuis and his ISP company, CB3ROB, to stop him from putting Pirate Bay back online. Kamphuis said at the time that the Mossad, the Israeli intelligence service, attempted to blow up his car as a warning. “My car did not decide to explode on its own,” he said.
Bais got to know Kamphuis in the tight community of Dutch Internet technical specialists, but began communicating with him more closely after they discovered they both had a mutual enemy in Spamhaus in 2011. At the time, Bais’s company provided network services to a data center that provided services for CB3ROB, which in turn provided services for Cyberbunker, a Web host company Bais acknowledged may have hosted some shady websites. Nonetheless, he said, Cyberbunker did not traffic in spam. When Spamhaus put Cyberbunker’s Internet protocol (IP) address on its blacklist, it also listed 4,000 other IP addresses that Bais said had nothing to do with Cyberbunker or CB3ROB. “This was extortion,” Bais said. “As soon as we dropped the IP addresses for CB3ROB, Spamhaus immediately dropped the other addresses which had nothing to do with spam.” In the meantime, Bais found many of his customers could not use email because of the blacklist.
Adam Wosotowsky, a threat researcher at the Internet security firm McAfee, said Spamhaus had a good reputation in the cyber-security world. “Spamhaus historically is not known for making knee-jerk emotional decisions,” he said. “Generally, Spamhaus tends to be very straightforward as to why they are blocking things. They are not in the business of causing false positives.” Phone calls and emails to Spamhaus were not answered. Cyberbunker couldn’t be reached for comment.
© The Daily Beast
Flintshire school pupils suspended for violence, sexual harassment and racism, report reveals (UK)
Hundreds of Flintshire schoolchildren are being suspended for sexual harassment, racist behaviour and violence toward teachers and their classmates every year.
28/3/2013- A council document obtained by the Chronicle also shows more than 1,000 pupils have been excluded for offences also including possession or use of weapons, threatening and dangerous behaviour, bullying and theft in the past three years. From September 2009-July 2012, 181 primary pupils and 894 high school students were temporarily suspended – with nine kicked out permanently.
The number of ‘fixed-term’ exclusions across the board included:
424 for violence towards pupils
282 for offences of violence against members of staff
112 for dangerous or threatening behaviour
54 for bullying
32 for substance misuse
11 for carrying a weapon
11 for sexual harassment
nine for racism.
Teaching unions said the statistics were ‘clearly worrying’ – and claimed poor parenting plays a part in children’s bad behaviour. Suzanne Nantcurvis – NASUWT’s national executive member for North Wales – added: “There is often a lack of helpful parental intervention.” And Liz Camino, Flintshire secretary for the National Union of Teachers, said social media websites – such as Facebook and Twitter – ‘have a lot to answer for’. The longest primary school exclusion was 30 days – for violence toward a member of staff in 2010-11. The figures show there was an average of 13,286 pupils in Flintshire’s junior schools over the three years, and a third of the county’s 75 primaries suspended children. Over the same period all 12 Flintshire high schools excluded students – with one suspending 80 in 2011/12. The lengths ranged from half a day for disruptive behaviour to 41 days for violence against a fellow pupil.
Other lengthy sanctions included 38 days for violence toward a member of staff, 26 days for substance misuse and 20 days for sexual harassment. Pupils were excluded for defiance (276 overall), verbal abuse (241), disruptive behaviour (127) and damage to property (28). The report prepared for Flintshire County Council’s lifelong learning overview and scrutiny committee said: “There is an ongoing concern about the number of pupils in some of our high schools who are receiving multiple fixed term exclusions in an academic year. “It is these pupils who are causing the majority of disruption.” Mrs Camino said the social networking sites cause a lot of problems – particularly in high schools – and told the Chronicle incidents can often be traced back to ‘something that happened on Facebook two or three days ago’. “You can have children come in who’ve spent half their night on Facebook and they’re not fit to be in school,” she said. “The amount of cyber-bullying that goes on is horrific. Social networking has a lot to answer for.”
Mrs Nantcurvis added: “The school becomes the focus for a lot of problems – things spill over.” The report says the number of pupils kicked out for good has dropped from 24 in 2003/4 to one last year. Mrs Nantcurvis said expulsions were usually reserved for ‘extreme circumstances... a last resort’. Mrs Camino claimed children arriving at school ‘not learning ready’ because of ‘deficiencies in parenting’ also causes discipline problems. She said teachers can often find themselves fighting ‘a losing battle’, and ‘a lot of children are on to a loser from the start’. “It’s certainly not the fault of the schools, they are doing all the right things,” she added. “In the scheme of things we’re talking about small numbers, and all credit to the schools in Flintshire that they are keeping so many in mainstream education. “Schools, considering the pressures on them, come up with the goods to provide a good education despite the expectations on teachers changing almost term by term.”
© The Flintshire Chronicle
Here's How Far-Right Extremists Recruit on Twitter
29/3/2013- It’s not hard to find extremists on the internet. But it’s really hard finding out who’s the most successful at spreading extremism, which can make counteracting their influence difficult. Now a pair of researchers think they’ve figured out how to do it — which could make extremist threats easier to identify and block. The researchers also discovered some peculiar data about how extremists on both the far right and left use Twitter and how online extremist networks are organized. In a new report, terrorism analyst J.M. Berger and his co-author Bill Strathearn found that traditional leaders on the far right are losing influence to new forms of extremist media, spread online by a small group of influential activists who are relative unknowns, but can communicate to a much larger audience of potential recruits. These activists are even attempting to make inroads into mainstream politics.
The team began by collecting 12 Twitter accounts owned by prominent self-identified white supremacists with a combined total of 3,542 Twitter followers. These accounts were for groups and individuals such as the white supremacist ideologue David Duke, various Ku Klux Klan factions, and neo-Nazi clubs like the Aryan Nations, American Nazi Party and the American Freedom Party. Next, the team narrowed in on the followers, of which 44 percent espoused what the team considered explicitly white supremacist views. The team looked at which of those followers were interacting with others the most and who had the most influence (meaning their tweets were retweeted by others the most). And finally, which websites were they linking to? And which hashtags were most popular?
The team concluded that a handful of ideologues were highly influential within the group, while most users were dabblers, in a kind of 90-9-1 rule for internet skinheads: 90 percent are lurkers and rarely contribute, 9 percent contribute some of the time, and 1 percent do most of the talking and effectively control the conversation. A full list of the most influential is included in the authors’ report (.pdf), published by the International Centre for the Study of Radicalisation and Political Violence, a London think tank. It might sound obvious, but that’s good news. “In short, the vast majority of people taking part in extremist talk online are unimportant,” the authors write in the report. “They are casually involved, dabbling in extremism, and their rhetoric has a relatively minimal relationship to the spread of pernicious ideologies and their eventual metastasization into real-world violence.”
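The 90-9-1 breakdown the researchers describe is straightforward to check against interaction data. The sketch below is a minimal illustration, not the report's actual method or code: the `activity` counts are invented toy numbers, and the function simply ranks accounts by how often they post or get retweeted and measures what share of all activity each slice contributes.

```python
def participation_split(activity):
    """Share of total activity contributed by the top 1%, the next 9%,
    and the remaining 90% of users, ranked by activity count."""
    users = sorted(activity, key=activity.get, reverse=True)
    total = sum(activity.values()) or 1
    n = len(users)
    top1 = max(1, n // 100)          # boundary of the loudest 1 percent
    top10 = max(top1 + 1, n // 10)   # boundary of the top 10 percent overall

    def share(group):
        return sum(activity[u] for u in group) / total

    return {
        "top_1%": share(users[:top1]),
        "next_9%": share(users[top1:top10]),
        "bottom_90%": share(users[top10:]),
    }

# Toy data: 100 accounts; one hyperactive user, a few occasional
# contributors, and a long tail of lurkers.
activity = {f"user{i}": 1 for i in range(100)}
activity.update({f"user{i}": 20 for i in range(1, 10)})
activity["user0"] = 500

shares = participation_split(activity)
print(shares)  # the single loudest account contributes most of the activity
```

On data shaped like the report describes, the top 1 percent's share dwarfs the rest, which is exactly the pattern that makes the most influential accounts easy to single out.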
The most prominent white supremacist leaders also suck at promoting themselves. Instead, their followers preferred to link to other websites like WhiteResister, Infowars and the white supremacist Council of Conservative Citizens. This suggests the old guard of American organized racism is “not generating daily buzz, on Twitter at least,” and that “these well-known leaders of white nationalism in the United States may be losing touch with their constituents.” Here’s the bad news. The most influential Twitter followers among the sample are “highly committed white nationalists unlikely to be swayed by intervention.” Influential users are also “actively seeking dialogue with conservatives” through hashtags #tcot (or top conservatives on Twitter), #teaparty and #gop, as well as frequently linking to mainstream conservative websites. But only 4 percent of users identified as mainstream conservatives, which suggests the hashtags “are driven more by white nationalists feeling an affinity for conservatism than by conservatives feeling an affinity for white nationalism.”
This affinity can be prevented in part, according to the authors, with several tactics. One, they claim their metrics can be used to identify casual followers, “whose interactions indicate an interest in an extremist ideology but not a single-minded obsession with it.” Anti-racist activists — and mainstream conservatives in particular — could perhaps focus on these fence-sitters, keep tabs on them, and try to pull them away from becoming radicalized. Blocking content is harder, and there are no clear guidelines, but many services like Twitter have begun blocking some neo-Nazi content, though it’s an uphill battle. Identifying the most diehard users first, and then blocking them, could cause the rest to wither away. Another option, the researchers suggest, is more targeted blocking of certain YouTube videos, which feed extremists’ Twitter feeds.
The authors encountered a similar phenomenon when studying left-wing anarchists, though there are some differences. Anarchists on Twitter still had a small number of highly influential users in control of the conversation, but the distribution wasn’t as sharply concentrated as it was among the white supremacists. Anarchists on Twitter are less likely to identify as strongly with anarchism as white supremacists do with their ideology, and are more likely to identify with multiple political ideologies — meaning other extremist ones. They’re also more likely to rely on mainstream websites for information, and the websites they prefer are mostly politically liberal.
But the anarchists’ hashtags show little sign of interest in mainstream liberals as potential allies. (Their hashtags are mostly Occupy Wall Street-related.) This may reflect an ideological difference. Anarchists are “fundamentally opposed to political institutions, compared to white nationalism, which is not opposed to institutions per se,” the authors write. Nonetheless, however great the differences between extremist ideologies might seem, their behavior is still pretty similar across these boundaries. Breaking their spell has to first begin with finding out who’s who.
Rechtes Land: the online map that is tracking Nazis in Germany
It shows the location of Nazi groups, their activities and crimes in an attempt to counter the rise of the far-right in Germany
27/3/2013- At first glance it looks like any other internet map, conveniently showing a smattering of cafes, nightclubs and bookshops. But Rechtes Land (Right Country) doesn't show the usual places of interest – it shows Nazis. Launched earlier this month following public donations of €6,000, www.rechtesland.de is the latest attempt to stem what is seen by some as growing far-right activity in Germany. Last year 900 armed police stormed 150 neo-Nazi premises in North Rhine-Westphalia, and the country has been rocked by allegations that a neo-Nazi cell in the eastern city of Zwickau called the National Socialist Underground committed 10 murders between 2000 and 2007. According to a study last year by the Friedrich Ebert Foundation, a German political foundation associated with the Social Democratic Party of Germany, 9% of all Germans hold far-right views, up from 8.2% two years earlier.
The map, which shows the locations of extreme-right groups, their associations, murders, attacks and current projects, was the idea of data journalist Lorenz Matzat, although the raw information comes from a collaboration with Apabiz, a Berlin-based nonprofit organisation that runs one of the country's most extensive archives on neo-Nazi activity. In the first two days of operation, Rechtes Land was visited by 48,000 users. Soon to be augmented with Wikipedia-style detail, the online map will include historical information from the second world war, memorial sites and everything people want to know, past and present, about extremism where they live. The data already shows 120 marches by fascist organisations in Germany in 2012 alone, says Matzat. "I decided we should use the web to show how much fascism is still alive in Germany," he says from his office in Berlin's Kreuzberg district. "The problem is shifting and not to talk about it doesn't make it go away."
This is not The Boys from Brazil; individual addresses will not be made available to Nazi hunters. What is on offer is a detailed digital map that shows far-right activity on a national scale, replacing a fragmented regional approach. The public will be invited to submit information, but this will have to be backed up with proof and verified. “It is difficult to monitor the situation all over Germany. There are regions where Nazi activities are very high,” says Ulli Jentsch from Apabiz. “In some regions the authorities claim there is no Nazi threat because they don’t want to have bad news about their town.” Even the famously liberal streets of Berlin are only a few miles from far-right activity, says Matzat. The surrounding state of Brandenburg is often cited as a hot spot for neo-Nazi activity. “If you walk around in certain areas with coloured hair or you are a black guy, you are in danger.”
© The Guardian Shortcuts blog
Twitter, Anti-Semitism, And A Quietly Important Free Speech Case
25/3/2013- Back in October, a hashtag on Twitter, #unbonjuif, went viral in France. It translates out to “a good Jew,” and, well, the French are just as prone to trolling as we are, so you can guess what happened next. Twitter took down the offensive tweets, but it didn’t go far enough for France: authorities ordered the company to fork over the identities of the offending Twits. Twitter has refused, arguing that as an American company, it doesn’t have to comply with French speech laws. So far, French courts have disagreed, but there’s much more to this than just philosophical differences.
Twitter’s opinion is costing it money. A lot of money, in fact:
A Parisian circuit court ruled against the social network, giving it two weeks to comply or face a fine of up to €1,000 ($1,298) for each day it doesn’t. The Union of French Jewish Students wants considerably more than that, says its president, Jonathan Hayoun, because the site “is making itself an accomplice and offering a highway for racists and anti-Semites”.
But the larger question is… legally speaking, who’s right here? We’ve ripped on Europe’s tendency to believe American companies should obey their tax laws before, but this one is murky to say the least. Basically, it’s not clear what law, if any, actually applies. If a French citizen posts something using an American company that is allowed under American law, where the entire world, including the French citizenry, can see it… does French law apply? Does American? If French law applies, does that mean the UFJS can go after any French citizen who posted something anti-Semitic… or anybody who posted something anti-Semitic?
Clouding the issue is that it’s not clear how seriously France takes the law. Its law against anti-Semitic actions and language is among the strictest in the Western world, but rarely enforced. Make no mistake, this is an important case. Most major Internet corporations are based in the United States, which has a high level of freedom of speech compared to other countries. Twitter winning or losing here is going to define how the Internet works for a long time to come. But either way, expect more cases like this: Until international communications law gets with the times, this is going to continue to be a problem.
Search Engine Scandal: Google Auto-Completes Your Racism
By Brian K. White
21/3/2013- Google offers an Auto-Complete feature. You start your search, they suggest what you might want to see, based on what others have searched… and Google thinks you’re a huge racist. I searched almost every country in the world and what I found was… well, it was interesting to say the least. I wouldn’t go as far as to say “informative”, but it definitely speaks to prejudices, pre-conceived notions and bigotry-at-large. This is the politics of search. Google is all about your bigotry and pre-conceived notions, and they’re not ashamed of it.
North America – Caribbean – Central/South America – Europe – Asia – Africa – Down Unda’
Why are Americans so?
Well this one started off with a bang.
Here we go. Why are Americans so stupid, so fat, called Yankees, and my favorite, obsessed with guns. Honestly, all of these are good questions.
Why are Canadians so?
Afraid of the dark? That’s a weird one. They live through a solid three months of it, I doubt that’s rational. So polite? I can see that. Not exactly racist, but interesting. Why are Canadians so rude to Americans? That’s not racist either, it’s fair. I’d guess it’s because Americans are so brash and obnoxious. SOURCE: I’m an American. I did get a stern but polite talking-to about George W. Bush’s policies at a gas station in Vancouver some years back. All I could do was assure the gentleman I didn’t vote for him.
Why are Mexicans so?
This could get ugly.
Short? Brown? Loud? Rude? Oh, what a relief! Those things aren’t bad. Aside from the beige question, they’re all subjective.
Caribbean
Why are Puerto Ricans so?
Well this was a mixed bag. On the one hand you’ve got proud and beautiful, which are both good things to be. On the other hand you’ve got “why are Puerto Ricans so mean?” and “Why are Puerto Ricans so lazy?” I lived in Puerto Rico for a couple of months as a travel writer, and I will say that doing my job was pretty difficult. Hard to get cooperation, most times. Not sure about lazy, though.
Why are Haitians so?
Why are Haitians black? Why are they so rude, so loud and so mean? I guess these are all fair, albeit subjective questions.
Why are Dominicans so?
Well that was an easy one. Black, so black, loud and dark. I guess what we’re asking here is why they’re black.
Central/South America
The continent(s) of love and cocaine. Here we go.
Why are Brazilians so?
Not much to see here. The only bad thing is “Why are Brazilians so annoying,” which really isn’t so bad.
Why are Chileans so?
Why are Chileans white? This question seems to come up a lot. People really seem to care what color people are, and why.
Why are Venezuelans so?
This is a good example of positive stereotypes. Three about how beautiful their women are, one about how good their baseball players are.
Why are Colombians so?
Here you might spot my typo, but Google caught it. Why are Colombian women so beautiful, or so easy? Why are they white? So many questions, so little brain power went into it. Seriously, what kind of answer were they hoping to get?
Why are Bolivians so?
Why are they so ugly, fine, I can overlook that one. But “why are Bolivians always late for dinner”? Somebody is going to have to explain this one to me, because I guess I’ve never known a Bolivian.
Why are Peruvians so?
Why are Peruvians short, okay, I get that one. The answer is malnutrition, in case you’re curious. You’ve only met ones who were raised poor, and thus malnourished, but that’s fine. Why are Peruvians Asian??? First of all, they aren’t, second of all, how many people had to Google this for it to come up as the third most common result for “Why are Perv-”?
Why are Argentinians so?
Why are they so white and rude, I understand. Not that I agree (or have ever even met an Argentinian), but those come up for nations around the world… chaotic and cocky are a bit mystifying.
Europe
Why are Swedish people so?
Why are the Swedish so happy, healthy, tall, and just plain King? All good questions. Strange ones, but that’s fine. I wouldn’t say reverse racist so much as just complimentary and nice.
Why are Norwegians so?
Why are the Norwegians happy or rich, fine… but why are they racist? Isn’t that a bit racist just for asking?
Why are the Irish so?
Wow… that’s quite a list. Why are the Irish so short, red-headed, pale and Catholic? You don’t know your geography so well, people of Google. More Scots are ginger than Irish folk, only parts of Ireland are Catholic (there was a bit of a war about this, don’t know if you heard), and everyone in northern Europe is pale.
Why are English people so?
Why are English people so pale, rude and tall… well, it could be worse. I know that for a fact because I also searched for the British, as seen below.
Why are the British so?
“Why are the Brit-” was as far as I had to type to get the questions: why are British teeth so bad, why are British people so mean, why are British women unattractive and why are British people so rude… might have some PR work to do there, Britain.
Why are Germans so?
Here’s another positive one. Why are the Germans so smart? Why are the Germans so successful? Why are the Germans so good at engineering? Well for all three of those I’d credit their educational system, with first-hand knowledge. I spent time in the German public school system as an exchange student, and I can tell you the teachers, programs, and indeed even the students, are top notch.
Why are the Dutch so?
Why are the Dutch so tall and happy I can understand. Both compliments, and not such strange questions… but why are the Dutch so orange I think I’ll never understand. Is John Boehner Dutch?
Why are the Polish so?
Oh good, the Polish. Why are the Polish made fun of? Why are the Polish considered dumb? And my favorite, why are the Polish represented as pigs in Maus… I don’t even know what that means and I’m not about to Google it.
Why are Ukrainians so?
Here’s another case of nothing but positives. Why are Ukrainian women so hot? Mila Kunis is Ukrainian, and might I say, meow. Why are Ukrainian women so beautiful? Why are Ukrainian women so thin? Why are Ukrainians so beautiful? Those are good questions, but the better question is: why are you not buying a ticket to go there right now?
Why are the Spanish so?
Why are Spanish people so annoying? Why are Spaniards so racist? Why are Spanish girls so easy? Too many questions, too many qualifiers needed. The better question is who these people are and why the Googling masses are asking this.
Why are Italians so?
This one’s about 75% racist. Why are Italians called wops? Why are Italians called Guineas? Why are Italians short? First of all, where do you live that they are called these things, because on the West Coast, I’ve never heard such epithets. And short? Guess it depends who you know.
Why are Greeks so?
Why are Greeks dark-skinned? Why are Greeks so arrogant? Why are Greeks so rude? They’re not dark-skinned, they’re olive-skinned, and pardon me, but it’s beautiful. Tear me up in the comments below if you like.
Why are Turkish people so?
Why are Turkish men so pervy? Wow. Really? Enough people are Googling this exact phrase (I’ve never heard the word “pervy” before) that it comes up as the #2 result?
Why are Jews so?
I know Israel isn’t in Europe, but it’s the only country in the Middle East that actually showed up properly in the search. Number one result: Why are Jews so cheap? I have a modest amount of experience with Jews in my life, and as a non-racist, I’m happy to not comment on this search result.
Asia
Why are Russians so?
Kind of a mixed bag here, but not so bad. Why are the Russians always the bad guy? Why are Russians so tough? Why are Russians so good at chess? You tell me.
Why are Tibetans so?
Why are Tibetans always setting themselves on fire? Hey man, it’s a solid question, if a sad legacy. After all, immolation is the sincerest form of flattery. Perhaps not better than result #4, why are Tibetan mastiffs so expensive?
Why are the Chinese so?
Why are the Chinese so smart and skinny? I assume the answer is Tiger moms, but as to why are the Chinese so good at ping-pong… You lost me. I have no guess, educated or otherwise.
Why are Japanese people so?
Why are Japanese people so smart and cute, fair enough, that’s kind and complimentary. Then we ask TWICE why they’re so weird. If you’ve seen their adult entertainment, perhaps you can understand why the question is so globally relevant.
Why are Koreans so?
Why are Koreans so skinny, angry, pretty and rude? It doesn’t sound so much like you have a question, it sounds like you have a new girlfriend you can’t wrap your head around.
Why are North Koreans so?
We just had to break the Koreas out (sorry, lil’est Kim). Why are North Koreans starving and short? Well, my guess to the latter is because of the former.
Why are South Koreans so?
Why are South Koreans so tall, rude, tall and racist? I’m seeing a trend emerging; it seems almost all races are potentially racist.
Why are Vietnamese people so?
This one is a tad uglier. Why are Vietnamese called gooks? Why are Vietnamese people so short? Not cool, Googlers of the world. Not cool.
Why are Cambodians so?
Yes, why are Cambodians “ghetto”? And why are they “dark”?
Why are Thai people so?
Why are Thai people so gay, nice, rude and happy? Well if I had to guess, I’d say it depends on the tip.
Why are Taiwanese so?
Why are Taiwanese girls so ugly, so pretty, so skinny and so easy? It’s almost as if Googlers think of these women as little more than objects to ogle. Not that I disagree with that, that would be racist of me.
Why are Filipinos so?
Why are Filipinos so short, so cool, their parents so strict, and why do they mostly end up as nurses… this is just a weird one.
Why are Indonesians so?
Why are Indonesians Muslims? Why are Indonesians so short? Why are Indonesians so angry? And my favorite, why are Indonesian Chinese so rich?
Why are East Indians so?
I had to use the phrase “east Indian” rather than “Indian” because failing to do so brought up results specific to Native Americans. But once I narrowed it down, it got pretty weird. Why are East Indians so cheap? Why are East Indians so rich and so rude? My personal experience does not reflect these search results, but obviously I’m in the minority.
Africa
Why are Egyptians so?
Yes. Why are Egyptians considered white? And why are Egyptians “not black”?
Why are Libyans so?
I assume when they ask why Libyans are revolting, they don’t just mean violently disgusting. But why would someone ask “why are Libyans white”?
Why are the Sudanese so?
We’ve got the old “Why are Sudanese people so dark,” which we can blame on simple racism, and “Why are Sudanese people so tall,” which we can blame on simple ignorance, but what about “Why are Sudanese confused about the word AIDS”? That’s a fairly strange one.
Why are Algerians so?
Why are Algerians white is a run-of-the-mill question at this point, but the next three suggestions are about Bulgarians… not sure what that means.
Why are Ethiopians so?
Having seen enough cry-for-help infomercials, I suppose it’s fair to ask “Why are Ethiopians stomach’s so big”, but then to ask “why are Ethiopians so fast” is perhaps a bit puzzling.
Why are Nigerians so?
Yes, why are Nigerians so smart, so corrupt, so dark and so loud? These are four fair, honest, weird questions. Loud? Really?
Why are Kenyans so?
Well this one is uncommonly homogeneous. Why are Kenyans so fast? Why are Kenyans fast runners? Why are Kenyans such good runners? Why are Kenyans fast? I guess the whole of the Googling world has one opinion of Kenyans. Could be worse. You could be the British.
Down Unda’
Why are Australians so?
Why Australians are so hot or rude, I understand, but to ask why they are so white? Seriously? Why are Australians white? And the capper on it, “Why are Australians upside down?” Well that just defies common sense. Come on, Googlers of the English-speaking world, I expect more from you.
Why are New Zealanders so?
Why are New Zealanders rude? So nice? Called Kiwis? Good at rugby? Nothing to see here, folks. Move along.
Conclusion
Whether you are or not, Google certainly assumes that you are a racist, and if I may say so, something of a weirdo. Don’t take it personally; it just means that you’re a member of a species that is largely racist, at least when searching the internet from the comfortable anonymity of their homes. Methodology: This was a tad less scientific than one might hope. I did 1-3 searches for each nation: “Why are X,” “Why are the X” and “Why are the people of X”, and selected the most interesting results. As such, these results should NOT be considered scientific, but anecdotal at best… now go try to find such a disclaimer from CNN or FOX when they do the exact same thing.
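For readers who want to reproduce the article's recipe informally, the lookups can be scripted. The sketch below relies on Google's unofficial suggest endpoint; the URL, the `client=firefox` parameter, and the `[query, [suggestion, …]]` response shape are assumptions based on commonly observed behaviour of that undocumented interface, which may change or be rate-limited at any time. The network call is kept separate from the parsing so the parsing can be checked offline.

```python
import json
import urllib.parse
import urllib.request

# Unofficial, undocumented endpoint (an assumption; may change without notice).
SUGGEST_URL = "https://suggestqueries.google.com/complete/search"

def build_url(prefix):
    """URL requesting autocomplete suggestions for a search prefix."""
    params = urllib.parse.urlencode({"client": "firefox", "q": prefix})
    return f"{SUGGEST_URL}?{params}"

def parse_suggestions(payload):
    """The 'firefox' client returns JSON shaped like [query, [suggestion, ...]]."""
    data = json.loads(payload)
    return list(data[1])

def fetch_suggestions(prefix):
    """Fetch live suggestions (requires network access)."""
    with urllib.request.urlopen(build_url(prefix), timeout=10) as resp:
        return parse_suggestions(resp.read().decode("utf-8"))

if __name__ == "__main__":
    # The article's recipe: feed in "why are <nationality> so" prefixes
    # and read off what other people have been searching for.
    for suggestion in fetch_suggestions("why are canadians so"):
        print(suggestion)
```

Results will differ from the article's, since autocomplete suggestions shift with time, locale, and language settings; that is part of why the author's disclaimer about anecdotal evidence matters.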
© Glossy News
Kenya petitions Facebook and Twitter over senders of hate speech
20/3/2013- Although Kenyan general elections earlier in the month were generally peaceful, the Kenyan government is petitioning Facebook and Twitter in a move aimed at pushing the two social media networks to uncover the identities of those perpetuating hate speech in the country. The decision by Kenya is itself likely to be viewed as hostility toward social media networks in a region where many countries are already planning to ban or regulate the use of social media networks as well as online news media. The rise in the posting of hate messages comes just two weeks after the East African country successfully held peaceful elections that were won by Uhuru Kenyatta, currently facing charges of crimes against humanity at the International Criminal Court in the Hague.
It's not clear yet whether Kenya will move to crack down on social media networks and online news media, which have become a conduit for people expressing their anger and feelings over governance issues in the region. Zambia and Malawi are already moving to close down and more closely monitor online media. The hostility toward online media comes in the wake of popular uprisings in Africa and the Middle East last year that were mainly coordinated through social media networks. However, critics say the move to regulate social media networks and to shut down some online news media poses a threat to the growth of the Internet in the region. The Permanent Secretary for the Ministry of Information and Communications in Kenya, Dr. Bitenge Ndemo, said this week, "The government will pursue the hate mongers vehemently. We will petition Facebook and Twitter to uncover the identities of people perpetuating hate speech."
Hate speech through social media networks caused the death of over 1,200 people and the displacement of over 600,000 in Kenya's 2007 disputed general elections, it is said. Africa is experiencing an explosion in the number of online media organizations, many of which are accused of promoting hate speech and racism. Zambian president Michael Sata has already ordered the Zambia Information and Communication Technology Authority (ZICTA), the country's telecommunications regulator, to close down at least some online media outlets in the country, claiming they are promoting hate speech.
In Malawi, the government has drafted a law, the E-Bill, which seeks to regulate and control online communications including social media networks in the country. The bill would require that editors of online publications make known their names, addresses and telephone numbers in addition to other information. The E-Bill further introduces the concept of government-appointed cyber-inspectors, who would have the powers to, among other duties, monitor and inspect any website or activity on an information system in the public domain and report any unlawful activity to the regulatory authority. Last year's report by the U.N. special rapporteur on racism Mutuma Ruteere urged African countries to implement measures to combat online extremism by websites that promote hate speech, but without curbing freedom of speech.
© CIO Asia
French anti-racism groups sue Twitter for $50 million
Following a wave of racist and anti-Semitic posts on Twitter in 2012, two French organisations have filed criminal charges against the US-based website for refusing to hand over the details of users violating French law.
21/3/2013- Two French anti-racism organisations on Thursday filed a criminal suit in a French court against US website Twitter, demanding a 38-million-euro fine (50 million dollars) for refusing to hand over data relating to French users who make racist comments online. The move is an escalation in the legal effort by the French Jewish Students Union (UEJF) and the “J’Accuse” organisation, who took the case to a civil judge in January following a wave of racist and anti-Semitic posts. In October, Twitter users had posted comments using hashtags [used to group posts according to themes] such as #UnBonJuif, meaning “A Good Jew”, which became the third most popular in France at the time. While many of the posts using this and other hashtags defended Jews, a significant number were overtly racist and in breach of French laws against racism and anti-Semitism. The groups had asked Twitter to provide data that would allow the identification of racist users of the site. They also demanded that Twitter put in place a system for alerting against racist or anti-Semitic posts.
‘Designed to make Twitter wake up’
Twitter was given two weeks to respond to the demands of the civil complaint. The anti-racism groups’ lawyer Stéphane Lilti told FRANCE 24 there had been no response, leaving his clients no option but to launch criminal proceedings. “We are upping the stakes because Twitter has not been listening to the fact that they have to abide by French law,” Lilti said. “The 38 million euros cited, which is [the equivalent of] 50 million US dollars, is designed to make them wake up to the fact that protecting the authors of racist tweets is not acceptable in France.” Lilti explained that from his clients’ point of view, Twitter was hiding behind US media law, which it argues does not apply in France.
“We are not against Twitter,” the lawyer insisted. “This action is purely aimed at people who write racist comments online. But if we are going to stop this kind of online behaviour in France, multinationals like Twitter have to abide by French law, and not hide behind the US First Amendment that guarantees freedom of expression.” “Twitter wants to be known as a bastion of free speech,” he added. “But you can’t apply US legal standards in all jurisdictions, such as France, where there are laws prohibiting the publication of racist or anti-Semitic comments.”
Twitter to appeal
Twitter, which had removed the offending posts after they went online at the end of 2012, has 15 days to respond to the criminal complaint. On Thursday the site said it would appeal and the case is expected to go to court in September. In a statement, the US-based site said it was “in constant contact” with the UEJF who it said were “sadly more interested in grandstanding than taking the proper international legal path for this data.” Twitter cannot control the vast number of Tweets that get posted by its users every day, but if complaints are made against users and their comments, those posts can be removed and accounts suspended. In October last year, Twitter used a country-specific filter in place in Germany to suspend the account of a banned far-right group in the city of Hannover.
© France 24
Major report into Racism on Facebook
The Online Hate Prevention Institute released a major new report into antisemitism on Facebook to coincide with the International Day for the Elimination of Racial Discrimination on March 21st 2013.
21/3/2013- The new report tracks the response to a number of antisemitic items on Facebook. Some of the items were included in OHPI’s previous report in 2012 into Aboriginal Memes and Online Hate; others are new in 2013. Facebook pages can provide a home for racism and facilitate the creation of new virtual communities based on hate of specific minorities. Facebook pages can also serve as archives for hateful content that can be easily found, shared, and spread. Hate pages on Facebook pose a danger to the social cohesion of society and, due to their low entry barrier, significantly facilitate the spread of racist content.
This report tracks the response by Facebook to a catalogue of antisemitic content over a period of time. The report highlights that there are ongoing problems with antisemitic content on Facebook. One problem is that Facebook appears unable or unwilling to recognize certain well-known kinds of antisemitic content as hate speech. Another problem relates to a lack of quality control. This report clarifies the issues and makes significant recommendations to Facebook on ways it can more effectively implement its existing policy against hate speech.
Significant forms of antisemitism which Facebook appears unable to recognize as hate speech include the use of propaganda very closely related to that used by the Nazis during the Second World War, for example, the imagery of Jews as rats that need to be exterminated. Another example is Facebook pages promoting the Protocols of the Elders of Zion, an antisemitic forgery that has been used to justify pogroms and genocide. Complaints by users about the use of Nazi symbolism in relation to the State of Israel are also routinely dismissed by Facebook.
This report’s broad conclusion is that the standard reporting tools available to all Facebook users, and the review by front-line staff in response to these reports, have a significant failure rate. New processes, including a review of complaints that are initially rejected, are needed in order to better respond to the problem of online antisemitism and online hate more generally. The report suggests a process of continual improvement be adopted by organisations like Facebook in their efforts to combat the proliferation of hate speech on their platforms.
© The Online Hate Prevention Institute
Ex Norway PM launches anti cyber xenophobia campaign
Council of Europe Secretary General Thorbjørn Jagland will open the ‘No Hate Speech Movement’ at Strasbourg’s Palais de l’Europe on Friday.
20/3/2013- The Council states that cyberspace is inundated by an overflow of xenophobia, intolerance, and discrimination. “Hate speech online has recently turned into a major form of human rights abuse, with serious consequences online and offline,” they say in a statement. Representatives say that today’s societies face a pressing challenge due to increased spreading, inciting, justifying, or promoting expressions of hatred amongst young people on the Internet. “Prejudice based on aggressive nationalism and ethnocentrism, hostility against minorities, bigotry on the grounds of sexual orientation and gender identity, anti-Semitism, misogyny, christianophobia, cyber-bullying, anti-gypsyism and islamophobia: the potential negative impact of online communication on democratic development has caused several reasons for concern,” the statement reads.
As an answer to these pressing threats, the Council of Europe has established an initiative, the ‘Youth project on combating Hate Speech Online’. It is aimed at equipping young people and youth organizations with the necessary skills to identify and fight racism and discrimination appearing in online hate speech. Young bloggers and youth organizations will receive training at the European Youth Centers in Strasbourg and Budapest “with an innovative capacity building approach on activating social networks’ communities”. Friday’s event, aimed at changing attitudes and mobilizing people to uphold human rights online, will be combined with national youth campaigns starting in 33 Council of Europe states, supported by EEA Norway Grants and voluntary contributions from Finland and the French-speaking Community of Belgium.
© The Foreigner
New media enables rampant Islamophobia (USA, opinion)
By Nat Sowinski
12/3/2013- The dubious nature of Internet scholarship and Web journalism reflects a growing sense of anti-intellectualism in the United States. From blatantly incorrect articles to subtly racist undertones, the Web poses a substantial opportunity for those with an agenda to promote their ideas disguised as objective journalistic truth. On the Internet, we discover that often, speculation and opinion are touted as fact. Nowhere is this more apparent than in U.S. scholarship, literature and formal and informal news media surrounding “Islamic radicalism,” “Islamism” and the notion of jihad. The post-Sept. 11 world has seen the emergence of a widespread strain of xenophobia and racist fear, particularly targeted toward Muslims, Arabs and Arab-Americans. There has been a growing sense of trepidation among many Web authors regarding a possible “Islamization of America.” The Internet provides Islamophobic pundits favorable conditions for their message — and agenda — to disseminate and gain readership.
Possibly the most dangerous facet of this rise of Web anti-scholarship is its manifestation of anti-Islamic messages, which carry with them strong suggestions for directing American foreign policy. Such Islamophobic pundits have unfortunately found their way into the Targum. As you have likely seen — especially from the huge reaction on this opinions page and elsewhere — the Targum last Tuesday included a full-page advertisement for “Islamic Apartheid Week,” which was purchased by the David Horowitz Freedom Center. I wish to elucidate some details about David Horowitz, his organization and the concerted effort to insert Islamophobia into American discourse. The David Horowitz Freedom Center is a conservative think tank founded by Horowitz, a political activist and notorious racist. The Southern Poverty Law Center identified the David Horowitz Freedom Center — formerly known as the Center for the Study of Popular Culture — as one of 17 “right-wing foundations and think tanks support[ing] efforts to make bigoted and discredited ideas respectable.”
Much of Horowitz’s effort is focused on the demonization of Islam and Muslim-Americans. The Center for American Progress recently published a report entitled “Fear Incorporated: The Roots of the Islamophobia Network in the United States” that cited Horowitz as a prominent figure in the demonization of Islam. He is also a figurehead in the astroturf (read: artificial grassroots) movement alleging that Islam is conspiring to “take over Western society.” Horowitz, in response, laughably accused the CAP of capitulating to the Muslim Brotherhood. Horowitz has published previous ads in school newspapers similar to the one published in ours. He ran an advertisement in 2008 in the Daily Nexus, UC-Santa Barbara’s newspaper, alleging that its Muslim Student Association had ties to known terrorist organizations. On Al-Jazeera, he stated that “The [MSA] pretends to be a religious organization, while it is really an arm of the Muslim Brotherhood … Hamas and Hezbollah.”
The reality is quite the contrary.
I believe this spurious pseudoculture of demonizing Muslims and Horowitz’s condemnation of MSAs have contributed deeply to racist sentiment against Muslims in the United States and have brought about concrete consequences. The New York Police Department’s recent spying on our very own MSA is, I think, a testament to this. The CAP report also included criticisms of a group named “Stop Islamization of America,” which was founded by noted Islamophobe Robert Spencer and leading “counterjihadi” activist Pamela Geller. Geller became known for spearheading the movement against the proposed construction of an Islamic community center near the former site of the World Trade Center. Geller sat on the panel for the “Islamic Apartheid Week” event at Temple University, an event that justifiably spurred outcry from Temple’s Muslim community.
“Temple University is not unique … the reception we got at Temple was just a snapshot of what America will look like in just a few years if we don’t stand up and go on the offense,” she said. The quote is indicative of a concerted effort to play the role of provocateur and scapegoat Muslims across America. I’d rebut that Geller and Horowitz are not unique. They are simply members of a larger national effort to justify hateful attitudes toward Muslims in the United States — and to justify intervention abroad. It is crucial that we raise our voices and speak out against this kind of racist, anti-Muslim hatred in the United States and on campus at the University.
Nat Sowinski is a School of Arts and Sciences senior majoring in Middle Eastern studies and minoring in philosophy.
© The Daily Targum
Is There Racism on YouTube? Black Content Creators Speak Out At SXSW
12/3/2013- DeStorm Power, Black Nerd Comedy, Kingsley and Shanna Malcolm are some of the most prominent black YouTube stars out there. They all have the power to entertain and to make their fans laugh in ways that they might not have been able to do in mainstream media. While the YouTube partner program has made it possible for people of color to make a living off of producing content for mass audiences by bypassing the traditional media model, black YouTube stars are still a small minority. DeStorm Power and Kingsley are the only two black content creators in the top 100 most-subscribed YouTube content creators list, which is mostly white and male. A recent SXSW panel discussed the struggles of content creators of color and how they handle racism on YouTube. The website Color Lines went out to SXSW this week and interviewed panelists and YouTube stars Franchesca “Chescaleigh” Ramsey and Andre Meadows of the Black Nerd Comedy YouTube channel about racism on YouTube and how black content creators on YouTube have to work harder than others in order to be seen and heard.
© New Media Rockstars
Socially Conscious Anti-Semitism (USA)
Cory Booker’s new website faces difficulty filtering out racist videos
7/3/2013- Democratic Newark Mayor Cory Booker’s new and much-hyped video-sharing website Waywire bills itself as a more serious and socially conscious version of YouTube, which will help users circumvent “all the junk” posted on crowded sites like YouTube and Vimeo, but one thing the website has not been able to filter out yet is anti-Semitic videos. Waywire, which was founded by Booker, former Gilt City president Nathan Richardson, and former TechCrunch executive Sarah Ross, is still in its pre-launch Alpha phase and only recently opened up registration to the public. But users have already posted an array of clips that justify the Holocaust, blame the Sept. 11 attacks on Israel, and allege a Jewish monopoly of the media.
Searching for videos with hashtags like “Israel,” “Jewish” or “9/11” brings up a number of these clips, including “Bill Maher Agrees with Farrakhan! Jews Run America!,” “Why Did the Germans Dislike the Jews?,” and “American Mass Media is controlled by Zionists!” While it is often difficult for social networking websites to remove content deemed offensive, sites like YouTube, Google, and Apple have recently taken steps to try to weed out anti-Semitism from their social networking platforms.
YouTube removed hundreds of videos last August after the Online Hate Prevention Institute released a report documenting alleged hate speech and anti-Semitism on the website. However, plenty of similar videos remain—in fact, many of the ones posted on Waywire are linked directly from YouTube. It is unclear what role, if any, Booker plays in the day-to-day operations or strategic planning of the business. When asked whether Booker would make an effort to remove objectionable content from Waywire, his office said “We have reviewed your request for comment from Mayor Booker on this issue, and we are declining comment at this time.”
Because of the sheer number of videos uploaded to sites like YouTube, staffers tend to rely on users to bring inappropriate or offensive clips to their attention by “flagging” them. While Waywire does not appear to have a similar system in place at the moment, it also has far less content to deal with. For example, searching for “Israel” on YouTube brings up over 1.2 million video results, compared to just 451 results on Waywire. Waywire has also tried to differentiate itself from YouTube by branding itself as more socially conscious and committed to positive change.
“Cory Booker, Nathan Richardson, and Sarah Ross saw that the Internet was full of places where young people could watch funny animal videos or clips from reality shows,” Waywire said in a statement posted to the site. “But there wasn’t a place to talk about the serious things which affect you as citizens. Things like getting a job, economic fairness, politics, and local and world events … [Waywire] is your forum for sharing those ideas.” Critics say this mission statement clashes with the site’s actual content.
“It’s difficult to understand how the new video sharing site can be touted as an attempt to ‘elevate the global conversation’ when the first page of videos you see upon typing the word ‘Jew’ includes clips of Louis Farrakhan claiming Jews control the world, another evoking conspiracy theories based on the Rothschilds family, and another featuring Holocaust denier Ernst Zundel,” said Adam Levick, managing editor of CiF Watch, a website that monitors online anti-Semitism. Others question the extent to which social media sites should be held responsible for offensive content posted by users, as long as they are not actively endorsing or encouraging it.
Internet freedom activist and New America Foundation fellow Marvin Ammori said websites like Waywire have the right to curate their own content to remove postings they find objectionable. But he added that Waywire “shouldn’t be liable for the speech of its users generally,” even if it was cofounded by a prominent politician and potential 2014 Senate candidate like Booker. “Let’s say that you’re a politician and you happen to be an investor in a phone company, and people use the phone to conspire to break into a bank or something. The politician has nothing to do with that,” said Ammori. “If people think of Waywire as just another platform where people build a community with one another, then you can’t really blame Cory Booker for the stuff people put up there.”
Omri Ceren, a senior advisor at the Israel Project who has advised the Global Forum for Combating Anti-Semitism on issues of online anti-Semitism, said a website like Waywire cannot expect to launch until it has the resources to monitor content and enforce its own terms of service. “Organizations need to be willing to devote resources to monitoring hate speech in the same way that they would devote resources to making sure videos play correctly,” said Ceren. “Content moderation isn’t something that’s ‘extra’ to an online community: It’s one of the core elements of community management.” Ceren also noted that Waywire’s terms of service (TOS) prohibit content that may be “harmful, fraudulent, deceptive, threatening, abusive, harassing, tortious, defamatory, vulgar, obscene, libelous, or otherwise objectionable.” “Either the site doesn’t have the resources or infrastructure necessary to enforce its TOS, in which case it wasn’t ready to launch, or there’s something much more problematic going on,” he said.
Waywire did not respond to requests for comment as of publication time.
© The Associated Press
Councillor under investigation on three fronts over Islamophobia allegations (UK)
6/3/2013- The councillor kicked out of the Conservative group after Islamophobic comments were posted on his Facebook page is now facing three separate investigations. Chris Joannides, who represents Grange ward, has been mired in controversy after comments and images which mocked Muslims appeared on his Facebook page. After barring reporters from a councillor conduct meeting two weeks ago, the local authority has confirmed it has received complaints about Mr Joannides and an investigation is being carried out. However, council chiefs were unable to confirm when meetings held to scrutinise the councillor’s conduct would be held in public. A council spokesman said: “The matter is under investigation and we cannot comment further.” Enfield police have also confirmed that an investigation is being carried out into the disgraced councillor’s actions. A spokeswoman said they were looking at “an allegation of internet-based hate crime”.
Enfield Southgate Conservative Association held the first stage of its own investigations into accusations against the councillor last night. Speaking before the meeting, a spokesman from the association said: “We will meet to see if we will be taking the investigation further. At this stage we have received no complaints from residents or members of the public.” Although the Conservative group on Enfield Council has removed the whip from Mr Joannides, he is still a member of the national Conservative Party. The Enfield Southgate association will decide whether further action to remove Mr Joannides from the national party should be taken. David Burrowes, Conservative MP for Enfield Southgate, refused to comment on Mr Joannides’ future within the party. He said it was a matter for the association to investigate. He added: “His future within the Conservative party is for the Enfield Southgate association to decide and I would not want to jeopardise investigations by commenting further at this stage.”
Mr Joannides denies being an Islamophobe.
© North London Today
Norway far-right group profiles Muslims
A list of hundreds of Muslim companies and organizations has been published on the website of anti-Islamic group NDL, an organization of which Anders Behring Breivik was a member.
5/3/2013- The Norwegian Defence League (NDL), an anti-Islamic group closely associated with the English Defence League (EDL), has published a list of hundreds of Muslim companies and organizations on its website. The list was assembled based on listings in the Public Entity Registry. Lars Johnny Aardal, deputy leader of the NDL, stated the purpose for its publication was "to show the extent of Islam and Muslims in Norway".
As a prologue, the NDL wrote:
"The list is far from complete, and a longer list can be prepared with multiple keywords or keyword changes we have used. We have only included entries under the keywords "Islam, Kurdish, Turkish, Muslim, Iranian, Iraqi, Somali, Pakistani, Arabic, Mohammed, Ali and Hussein."
Terje Emberland, a senior scientist at the Norwegian Center for Studies of Holocaust and Religious Minorities, told Norwegian press that a list of this sort had only been published once before in Norwegian history, when an anti-Semite compiled a list of Jewish businesses in the 1930s. Emberland stated that "In this way, the NDL clearly exposes its character, and aligns with the fascist and racist tradition to which it belongs." Khamshajiny Gunaratnam, Deputy of Ungdom mot rasisme (Youth against Racism Organization), responded to the list by saying: “I also thank NDL! They give me a good idea of the strength of the Muslims for the Norwegian society. Those people organize themselves, start businesses and participate in society. Oh, what a delight!”
Inspired by the EDL’s foundation in 2009, the NDL was established around the end of 2010 and the start of 2011. Despite leadership conflicts between factions in early 2011, the group was eventually led by Lena Andreassen for approximately a month before she was dismissed following an unsuccessful April 9, 2011 demonstration. After the July 22, 2011 attacks in Norway, it was disclosed that attacker Anders Behring Breivik had previously been an NDL member under the pseudonym "Sigurd Jorsalfar," a name derived from the medieval Norwegian crusader-king Sigurd the Crusader.
Breivik’s December 6, 2009 forum post on the Norwegian website Document.no is also the first documented proposal for establishing a Norwegian organization along the lines of the EDL. Norwegian media disclosed that an investigation of local electoral lists in late August 2011 exposed that eight Norwegian politicians from five parties were members of the NDL internet forum. A secretly recorded informal conversation revealed that two mayoral candidates from the right-wing Democrats party had discussed killings during a gathering arranged by the Stop the Islamisation of Norway group in February 2011 in Oslo.
Håvar Krane, mayoral candidate in Kristiansund, expressed to Kaspar Birkeland, mayoral candidate in Ålesund, his desire for "putting a Glock in the neckhole" of Norwegian Foreign Minister Jonas Gahr Støre and "blocking all the exits with Molotov cocktails" during the government cabinet’s Christmas dinner. Krane had been an NDL leader for three weeks during a transitional period.
© World Bulletin
The Only Earthling With a Facebook 'Dislike' Button
3/3/2013- About 15 years ago, Chuck Rossi became what’s known in Silicon Valley as a release engineer. This is the person tasked with gathering up all the code written by a company’s many engineers and making sure it works together as a whole. Rossi coordinates the process for looking over code for bugs, chatting with engineers about their work and deciding which new features are ready to get baked into a particular version of a product. It should be noted that Rossi is not just a release engineer. He is THE release engineer. He’s done this job at VMware (VMW), Google (GOOG), and now Facebook (FB). He may well know every single person in the closely knit world of Silicon Valley release engineers. As noted in my cover story this week, the work of Rossi and his team stands as one of the major reasons Facebook has been able to handle a billion users.
So how does one become a release engineer? Well, in Rossi’s case, it was a conscious decision. “It’s like plumbing,” he says. “It’s not the most glamorous thing in the world, but I realized that if you’re good at it, you could go to any software company in the world and they would say: ‘When can you start?’” Rossi’s main job is to oversee the Push, a daily exercise in which Facebook takes in hundreds of changes to its code from engineers, checks to make sure they’re good, and then adds them to Facebook.com. Over the years, Facebook has built a number of software tools that do the first round of checks on the code, leaving Rossi to manually inspect the additions with potential to cause the most problems. Ritual surrounds the entire process. Rossi, for example, has a bar to the left of his desk—a serious, full-on bar packed with scotch and tequila and whatever else you may want. The bottles are bribes from engineers trying to persuade Rossi to incorporate their changes into the Push.
There’s also what’s known as Push Kharma. This stems from profile pages tied to each engineer. Rossi can pull up someone’s name and see what code they have submitted. “Every developer is born with four stars to their name,” he says, pointing to a ranking system on the profile page. “If we have an issue when we take someone’s code—and it blows us out of the water—then it takes them down half a star and I write what happened,” Rossi says. The system has a thumbs-down indicator, a feature many Facebook users have long sought. “I am the only guy who has a ‘dislike’ button on Facebook,” he says. “A lot of people want [one], but this is the only place you will see it.”
When engineers drop down to two stars, they’re banned from making changes until they’ve completed a review and a retraining process. “People here are pretty freaked-out about losing their stars, but not in a bad way. It’s all done in good fun,” Rossi says. And if you catch an error before it goes up on the site and jump in to save the day, you can earn a half-star back. Failing that, he jokes, “you can bring booze or cupcakes.”
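The star system Rossi describes can be read as a simple reputation ledger: everyone starts at four stars, shipping broken code costs half a star, catching an error before release earns half a star back, and hitting two stars blocks further pushes until retraining. The sketch below is purely illustrative; the class name, thresholds, and methods are assumptions based on the article, not Facebook's actual tooling.

```python
class PushKarma:
    """Illustrative model of the 'Push Kharma' star system described above."""

    START_STARS = 4.0    # "Every developer is born with four stars"
    BAN_THRESHOLD = 2.0  # at two stars, changes are blocked until retraining
    PENALTY = 0.5        # a breakage costs half a star
    REWARD = 0.5         # catching an error pre-release restores half a star

    def __init__(self):
        self.stars = {}  # engineer name -> current star count
        self.log = {}    # engineer name -> notes on what happened

    def _get(self, engineer):
        # New engineers start with the full four stars.
        return self.stars.setdefault(engineer, self.START_STARS)

    def record_breakage(self, engineer, note):
        """Bad code made it into a push: deduct half a star and log why."""
        self.stars[engineer] = self._get(engineer) - self.PENALTY
        self.log.setdefault(engineer, []).append(note)

    def record_save(self, engineer):
        """Engineer caught an error before it shipped: restore half a star."""
        self.stars[engineer] = min(self.START_STARS,
                                   self._get(engineer) + self.REWARD)

    def can_push(self, engineer):
        """Engineers at or below two stars are banned from pushing changes."""
        return self._get(engineer) > self.BAN_THRESHOLD

    def complete_retraining(self, engineer):
        """After review and retraining, the engineer's stars are reset."""
        self.stars[engineer] = self.START_STARS
```

Under these assumptions, four breakages in a row take an engineer from 4.0 stars down to the 2.0-star ban threshold, matching the "half a star per incident" accounting in Rossi's description.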
© Business Week
Milwaukee Jewish group launches anti-Semitism reporting site (USA)
1/3/2013- Milwaukee’s Jewish Community Relations Council, concerned that bias toward and harassment of Jews is underreported, has launched a new online reporting system at www.milwaukeejewish.org/antisemitism. The Anti-Semitism ReportLine comes as the council releases its 2012 audit of anti-Semitic incidents. This year's report includes 12 reported incidents, most of them verbal and written expressions, and one case of vandalism of public property. “Anecdotal evidence suggests that many expressions of anti-Semitism go unreported,” said council Director Elana Kahn-Oren in announcing the results of the audit. “Specifically, we have heard about continued anti-Jewish harassment and verbal expressions among middle and high school students. That often takes the form of jokes or teasing, which can be challenging for students to counter appropriately,” she said.
In the coming year, Kahn-Oren said, the council will be helping students identify and respond to anti-Semitism, an effort that began with a seminar last year led by the Anti-Defamation League. “Throughout the program students told story after story about ‘harmless’ jokes and comments that left them feeling hurt and powerless,” she said. Those included several incidents of students throwing coins and waiting for the Jewish students to pick up the coins, a reference to the stereotype of Jews as cheap; and one student being nicknamed "Jew" throughout middle school.
Other trends in the 2012 audit:
• The “disturbing” use of Holocaust and Nazi language and imagery. Incidents included a UWM student leader dressed as Adolf Hitler at a party, where he posed for photos, including one in which he points into an open oven; and a Nazi slogan and dollar sign carved into a tree on the Oak Leaf Trail. Kahn-Oren said a third of all reported incidents referred directly to the systematic state-sponsored persecution and murder of approximately 6 million Jews by the Nazi regime.
• Derogatory emails and printed literature, distributed in communities throughout the greater Milwaukee area, including Oak Creek, South Milwaukee, Milwaukee, Shorewood and Whitefish Bay. In one case, the literature urged readers to reject diversity and included such comments as “The Jews who own the U.S. government have declared war against white America.” An email comment to a local blogger said Hitler “saved the German gentiles from your kind” and told him to "get the (expletive) out of America, you Polish piece of (expletive).”
© JS Online
Hate videos on YouTube inspired Birmingham terror plot (UK)
28/2/2013- The teachings of a radical preacher who is believed to have inspired three Birmingham men convicted of planning a major terror attack are still available on YouTube, MPs have warned. But quizzed by Midland MPs in the House of Commons, a YouTube executive warned it was impossible to prevent the films being distributed. The volume of material uploaded to YouTube every day meant staff had to wait for members of the public to alert them to offensive or illegal material before it could be removed, said Sarah Hunter, Head of UK Public Policy for YouTube owner Google. She was quizzed by the Commons Home Affairs Committee, which includes Birmingham MP Steve McCabe (Lab Selly Oak) and Black Country MP David Winnick (Lab Walsall South). Mr Winnick warned that YouTube was providing a platform for “any hate merchant”. It followed the conviction of three men from Birmingham who were found guilty of planning a massive suicide bombing campaign that would have caused more deaths than the July 7 London bombings. The trio were told they will all face life in prison when they are sentenced in April or May. Irfan Naseer, 31, from Sparkhill, Irfan Khalid, 27, from Sparkbrook, and Ashik Ali, 27, of Balsall Heath, were found guilty of planning the UK attacks after a 14-week trial at Woolwich Crown Court.
The trial was told that Irfan Khalid was recorded telling the other men to listen to the preachings of Anwar al-Awlaki, adding: “Trust me if you listen to it, it will soothe your heart.” Committee chairman Keith Vaz revealed he had searched YouTube and found videos of sermons by al-Awlaki, the former head of al-Qaida in the south Arabian peninsula. Ms Hunter said: “It is worth remembering the scale of content on YouTube. There’s 72 hours of content uploaded onto YouTube every minute of the day. So it’s just physically not possible for us to look at every video that gets uploaded. We rely on our users – when they tell us there’s content that breaks the guidelines, that’s when our team kicks in, reviews it and removes it.” Mr Winnick said: “It’s not simply the rantings of the person mentioned, the cleric mentioned, but other incitement to hate crimes, certainly against Muslims, anti-Semitism and the rest – you say matters are flagged up when complaints are made, my question is before complaints are made what sort of controls are there to try to ensure hate crimes, incitement against people because of their racial origin or religion or sexuality, doesn’t go on?” He added: “So it’s open really to any hate merchant, until hopefully somebody flags it up very quickly?” Ms Hunter said that objectionable content could be removed within an hour if it was “flagged” by users.
MPs also heard from Sinéad McSweeney, Director of Public Policy in Europe, the Middle East and Africa for internet messaging service Twitter. She explained that Twitter had a deliberate policy of not removing objectionable material unless it was actually illegal, in which case it could be blocked in the specific country where it was against the law. For example, pro-Nazi material might be blocked in Germany, where it was illegal, but not elsewhere. Mr Winnick asked: “You see, if someone on Twitter said Hitler was right or the Holocaust never occurred, which is not a criminal offence in this country, there’s no reason why it should be, or a rape victim ‘asked for it’ - a very crude description and absolutely disgusting – until someone complains, it remains on Twitter, is that right?” Ms McSweeney said the material was unlikely to be removed even if there was a complaint. She said: “Those events, those instances, don’t just happen on Twitter. People stand up in football stadiums and hurl racial abuse at players on the field. Those around them will call them out on that. And similarly on Twitter, rather than Twitter deciding as a corporation or a bunch of individuals whether that is good or bad, our approach is that other users on the platform will decide what is good speech and what is bad speech.”
Mr McCabe also challenged the policy, asking: “It sounds like you’re dangerously close to describing yourself as the innocent arch facilitator and that Twitter trolls are the responsibility of everybody else, and that cyber bullying is the responsibility of those who do it. I don’t deny their responsibility but it does seem to me that they’re able to do it with enormous reach because of the service you provide, and if that results in a youngster deciding to take his own life or in some other tragedy... surely if it results in that you’ve got to go back and examine what you do, and decide what more you can do to control this thing you have unleashed.” Ms McSweeney said: “On the safety side, not only do we have a set of rules by which users’ behaviour is measured and observe the laws of the country, we also have a hugely densely populated safety centre with advice for parents, teens, teachers. And just as the ‘bad speech’, as you would term it, reaches millions of people, those safety messages, that advice, those resources that are there to help people experiencing depression or mental health difficulties, are also there on all our platforms and accessed by individuals who are vulnerable.”
© The Birmingham Post
Online bullying, hacking, fraud affect one in eight (Netherlands)
1/3/2013- Some form of cybercrime affected one in eight people in the Netherlands last year, according to the latest security monitor published by national statistics office CBS. The CBS definition ranges from online bullying to hacking and phishing. Online bullying accounted for one in four cases. Half of cybercrime victims had their computers, smart phones or email accounts hacked. The monitor also showed one in five people had experience of traditional crime - such as violence, theft and burglary - last year, and one in three people feel unsafe from time to time.
© The Dutch News
A Web of hate: European, U.S. laws clash on defining and policing online anti-Semitism
24/2/2013- Last October, the hashtag #unbonjuif (#agoodjew) was trending as the third-most tweeted subject in France. Users jumped on the chance to tweet phrases like “a good Jew is a dead Jew,” ultimately forcing the French Jewish students’ union (UEJF) to file a lawsuit against Twitter for allowing that content to appear. When a French court decided this January that Twitter must reveal the identities of users who sent out those anti-Semitic tweets, a cross-continental debate ensued on the difficulty of defining and policing anti-Semitism online. The French incident was hardly the first case of hate in social media and on the Web. The Simon Wiesenthal Center’s 2012 Digital Terrorism and Hate Report found more than 15,000 websites, social networks, forums, online games and apps that disseminated hateful content. Also in Europe, a report this month by Community Security Trust showed that the number of anti-Semitic incidents via social media in the United Kingdom grew nearly 700 percent in the past 12 months.
“Social media is becoming more and more of a problem for us if you look at anti-Semitism,” Ronald Eissens, co-founder of the Dutch anti-racism group Magenta and the International Network Against Cyber Hate (INACH), which works to counter cyber-hate and has 21 members in 20 countries, told JNS.org. “There’s a lot of it around. Prosecution is a lot harder because most social media are based firmly in the U.S.” In France, the Gayssot law of 1990 was passed to repress racist, anti-Semitic or xenophobic acts and criminalizes Holocaust denial. French Holocaust denier Robert Faurisson later claimed the law violated his right to freedom of expression and academic freedom, but the United Nations Human Rights Committee ruled against him. France punishes the dissemination of racist content online with fines and terms of imprisonment. These penalties increase if the dissemination was public—for example, on a website rather than in a private email—according to the American Jewish Committee (AJC).
“The French justice system has made a historic decision,” Jonathan Hayoun, president of the UEJF, said in a statement about the French court’s recent Twitter ruling. “It reminds victims of racism and anti-Semitism that they are not alone and that French law, which protects them, should apply everywhere, including Twitter.” France has faced off against an American online giant before. In 2000, France prosecuted Yahoo! for selling Nazi memorabilia online. In France, it is illegal to display such items unless they are in a theatrical or museum setting. A French court ruled at the time that Yahoo! had to make the auction site inaccessible to French users or pay a fine. Although it never legally accepted the French ruling, Yahoo! eventually chose to remove the auction. Then, in 2012, Twitter, Facebook and YouTube complied with German law by either taking down material posted by a neo-Nazi group or by blocking users in Germany from access to the content, according to the New York Times.
Additional broad laws have been passed on racism and cyber-hate. The Council of Europe’s Additional Protocol to the Cybercrime Convention was passed in 2003 and became enforceable in 2006 after receiving the required number of signatures. The protocol criminalized racist and xenophobic acts committed through computer systems. The European Framework Decision on Combating Racism and Xenophobia was then passed in 2008. In 2005, the European Union Monitoring Centre on Racism and Xenophobia (EUMC)’s Working Definition of anti-Semitism was released, defining the phenomenon as “a certain perception of Jews, which may be expressed as hatred toward Jews.” “Rhetorical and physical manifestations of anti-Semitism are directed toward Jewish or non-Jewish individuals and/or their property, toward Jewish community institutions and religious facilities… More specifically, manifestations could also target the state of Israel, conceived as a Jewish collectivity,” the definition reads.
Though that definition was never legally binding, various international bodies, several law enforcement agencies and European courts have used it in their investigations. It is essentially meant to “help police forces who are monitoring anti-Semitism on the ground to have a better understanding of what anti-Semitism is,” Kenneth Stern, the AJC’s specialist on anti-Semitism and extremism, told JNS.org. Under the First Amendment, hate speech in the U.S. must be likely to cause violence or harm before it can be deemed criminal. But in the European Union, speech can be prohibited even if it is only abusive, insulting or likely to disturb public order, noted Talia Naamat, legal researcher at the Kantor Center for the Study of Contemporary European Jewry in Jerusalem. There are many laws on Holocaust denial in Europe, including in Germany, Belgium, and Austria, where British Holocaust denier David Irving was convicted and imprisoned in 2006.
In Spain, Holocaust denial was a criminal violation until 2007, when a court ruled in the case of neo-Nazi activist Pedro Valera that Holocaust denial could not be punished with imprisonment because the act falls within free speech. But in January, Spain’s justice minister proposed a new bill that would make Holocaust denial a criminal offense if it incites violence. The bill is expected to be approved later this year. “I believe this case best encapsulates the debate (in Europe) between freedom of expression versus incitement to hatred, as well as the varying degrees of protection from hate speech,” Naamat told JNS.org. But frequently, such European laws appear as part of a broader “incitement to racial, ethnic or religious hatred or discrimination,” or as part of the general prohibition of genocide, she said.
In the Netherlands, the Dutch penal code includes a broad anti-discrimination provision, “so, anti-Semitic content in essence will be prosecuted if it’s brought to the prosecutor as falling under the anti-discrimination legislation,” Eissens said. Three cases were recently filed against Jeroen de Kreek, a Dutch Holocaust denier who posted his material on several websites. Having already lost one case, he will face the other two this spring. In those cases, too, de Kreek is likely to be convicted, as “his material is blatantly anti-Semitic,” according to Eissens. The UK has only general legislation regarding harassment and discrimination, the Public Order Act 1986, which states that “a person who uses threatening words or behavior, or displays any written material which is threatening, is guilty of an offence if he intends thereby to stir up religious hatred.” Other laws, the Protection from Harassment Act, the Malicious Communications Act, and the Racial and Religious Hatred Act, were passed in subsequent years.
“Our perspective is that things which are illegal offline should be illegal online,” said Dave Rich, spokesman for Community Security Trust, which conducted the February survey on social media anti-Semitism in the UK, according to the International Business Times. “Racial abuse laws were made from incidents in the street, not online.” European laws on the issue, however, are not uniformly applied across the EU. Even the European Court of Human Rights does not offer an accepted definition for “hate speech,” instead offering only parameters by which prosecutors can decide if the “hate speech” is entitled to the protection of freedom of speech. Prosecutors therefore exercise a great amount of discretion, as do police officers, who must classify the act as a hate crime or not, and judges, who must assess which action or speech is likely to disturb public order. “That assessment can be subjective,” Naamat said.
INACH’s Eissens emphasized that prosecuting anti-Semitism is “a thing we do but not the only thing we do.” The organization is also highly focused on counter-speech projects, education and prevention, though Eissens does believe that the law is necessary in some extreme cases. “It’s a bit like the police arrests 20 people this week and the same people are back on the streets doing it again one month later, or they’re joined by another 40; it doesn’t mean that the police has to stop working,” he said. Stern believes that it is generally “more effective to have hateful speech marginalized than censored,” particularly by having high-ranking officials or politicians call it out. In 2005, Stern was part of a debate in the U.S. between Jewish groups on the issue of anti-Semitism online. One school of thought was in favor of removing anti-Semitic content, while the other believed such content “is a way to train kids in this new medium” on how to distinguish hateful speech from benign speech, he said.
The bigger problem, Stern said, is when anti-Semitism, online or otherwise, is expressed as normal, polite dinner conversation. “I’ll be less worried if it’s half a dozen neo-Nazis with tattoos sitting in a bar someplace,” he said.