Here's a nice controversial one for discussion: should Internet access providers filter pornographic content, requiring anyone who wants access to such content to "opt in"? Or is it inappropriate for an access provider to restrict what its customers can access? The BBC presents a good overview of the debate currently under way in the UK, with a range of views from feminists, open rights campaigners and others.
Some years back, under threat of regulation, mobile providers in the UK created a self-regulatory code imposing blocking obligations. As a result, if you buy a SIM for a UK mobile network, you will be unable to access content deemed suitable only for adults (which includes, but is not limited to, pornography) unless you prove your age and have the access control removed.
An approach based on filtering on the mobile networks made sense at the time, when a mobile phone talked only to a mobile network, but it makes far less sense today, at least on its own. Many mobile phones, particularly popular devices such as the iPhone and Android handsets, offer other methods of communication, including Bluetooth and Wi-Fi, which are outside a network operator's control, so restricting access on the cellular network alone is a very limited form of protection. Similarly, even within the cellular domain, not all connections can be filtered: in particular, RIM's BlackBerry devices are currently not filtered, since browsing traffic passes in encrypted form to RIM's own infrastructure for onward transmission. (RIM is currently discussing with Ofcom how this can be fixed quickly.)
The current debate has a number of fronts: what should mobile operators be doing, what should other players (such as Facebook and Apple) be doing, and, importantly, what should fixed-line providers be doing? It does not make sense to me for mobile providers to go forward alone.
The issue is not just a communications one, of course; the more fundamental question is whether certain content is, or is not, appropriate for children to view. But if we decide that certain content should not be accessible to children, it would seem to follow that there should be a convenient, user-friendly way of helping parents enforce that decision. There is a whole range of views on whether the approach should be opt-in, "active choice" (whereby the setting is chosen when the device is first powered on) or opt-out.
My view, for what it is worth, is that, whatever solution is chosen, it should not be left to operator discretion: the government needs to pick a solution and then mandate it. The role of an operator is one of a technical services provider, not a moral arbiter.
What do you think? How do things work where you are?
I agree with you that "the role of an operator is one of a technical services provider, not a moral arbiter."
Regarding filters provided by ISPs for children/pornography: in the US, the (failed) legislation attempted to pin the responsibility not on the ISP or the parents, but rather on the commercial distributor of the material.
http://en.wikipedia.org/wiki/Child_Online_Protection_Act
Interestingly, the federal legislation requiring pornography to be filtered on school and library internet access was successful (http://www.fcc.gov/guides/childrens-internet-protection-act), as much of this funding-based legislation tends to be, since government-funded agencies and non-profits often have limited or no other options.
The EU has published a new plan to give children the digital skills and tools they need to benefit safely from the digital world, by building up the market for interactive, creative and educational content online. The planned actions are grouped around four main goals:
• To stimulate the production of creative and educational online content for children, and to develop platforms which give access to age-appropriate content
• To scale up awareness-raising and teaching of online safety in all EU schools, developing children's digital and media literacy and self-responsibility online
• To create a safe environment for children, in which parents and children are given the tools necessary for their protection online, such as easy-to-use mechanisms to report harmful content and conduct, transparent age-appropriate default privacy settings, and user-friendly parental controls
• To combat child sexual abuse material online by promoting research into, and use of, innovative technical solutions in police investigations
How do you think this approach will address some of the issues raised in the current debate as an alternative to filtering and legislation?
The full text of the publication is available at http://europa.eu/rapid/pressReleasesAction.do?reference=IP/12/445&format=HTML&aged=0&language=EN&guiLanguage=en&goback=.gde_1430_member_112601797
I've yet to read the document, though it will be on my list when I'm back in the office, but my initial view is that, unless considerable funding is available, the first two points will be very difficult to achieve.
The third point, creating a safe online environment, almost invariably requires some form of filter, although there is likely to be much debate as to who should be responsible for it: the access provider, the computer supplier, the parents? Providing an effective filter is no simple feat, and it is certainly not without cost. Is this to be another cost on access providers, or a state-funded "public interest" activity?
The fourth point is a hugely challenging one. It sounds like a recipe for a war of technical attrition, in which police forces will be slow to deploy new tools while distributors of indecent images find new ways to distribute, in increasingly secure and untraceable ways. It is a *very* tough nut to crack; one might wonder whether legalising pseudo-photographs would help to reduce the number of instances of abuse?