As learners, young and old, increasingly use the Internet for learning purposes, issues arise that require new and creative solutions. One of the most vexing is creating an environment where people feel safe, yet one that affords users the openness and freedom to pursue their learning interests.
This question was at the heart of the Aspen Institute Task Force on Learning and the Internet, a project run by the Aspen Institute Communications and Society Program and supported by the John D. and Catherine T. MacArthur Foundation. For over a year the Task Force, composed of 20 top-level experts and leaders from various disciplines, addressed this and other questions, seeking to balance the opportunities for learning online with the need to assure users that their data, privacy, and safety would be secure as they moved up the learning ladder.
Recognizing that safety and privacy should be preserved, however, does not mean creating a culture of fear, or one of over-filtering protectionism. Rather, the Task Force determined that a trusted environment would best be formed by local stakeholders – parents, educators, local leaders, businesses, and even students themselves – coming together to address the principles and actions that would create trust.
The areas of inquiry and agreement suggested that key characteristics of such an environment might include:
−Transparency and Openness: Require easy-to-read disclosures to enable learners and other stakeholders to clearly understand who is participating, what the norms and protections are, what data is collected, and how it is used.
−Participation: Provide opportunities for individual and interest group participation in decision-making and policymaking related to the development and deployment of connected learning solutions.
−Data Stewardship: Find ways to protect data that may include mechanisms to reduce the risk of harm, such as clearly delimiting the permissible uses of data, de-identifying sensitive data and/or deleting data once it no longer has value for learning. Data can also be used to provide feedback about what works, thereby shortening the cycle to improve the ecosystem of learning networks.
−Technology Innovation: Create and deploy technologies that support a trusted environment, such as the use of metadata to convey and enforce data policy or privacy dashboards that indicate what information is shared with whom.
−Accountability: Adopt policies and procedures or a code of conduct that supports responsible learning environments.
−Oversight and Enforcement: Establish regulatory arrangements to protect the integrity of learning networks with competent and appropriately resourced bodies in place to enforce these principles.
But the more surprising area of protection, the Task Force found, is internal to the learner: what the group called “Digital Age Literacies” – specifically, media, digital, and social-emotional literacies. Let’s face it, if the Internet is to be open and free (in the free speech sense), and if we are to be able to access content from anywhere in the world, then authorities are not going to be able to limit effectively what goes onto the Internet.
Rather, the filtering will need to come at the reception end: from the intermediary (e.g., Internet Service Provider), via software, or by the user. For various good reasons, we do not want ISPs or governments censoring content except in extreme cases such as pornography or national security. Filters have proven, thus far, to be at once too restrictive and not restrictive enough to be effective. There are plenty of anecdotes of over-blocking, such as a filter targeting “rape” that also hides “grape,” or similar crude matching that blocks “Bikini Atoll” or “Moby Dick.” And we know that students can often get around filters to reach the very content the filters are meant to block. Neither outcome advances the cause of learning.
The ultimate answer is for the learner to gain the digital, media, and social-emotional literacy tools to understand the medium, where the content is coming from, how to avoid unsafe activity, and how to communicate effectively online. Bullying and sexual activity are just as prevalent offline as online, though online behavior can extend the harm, making it never-ending and widening its scope. Children today need the training to understand predatory behavior wherever it occurs, and how to handle it.
The Aspen Task Force cites the usefulness of Digital Age Literacies in keeping children safe online in two of its recommendations, Action Steps T and Z. They are not the only solutions, but they are important ones that deserve strong and sustained attention.
Action T: Integrate risks related to digital life into all existing risk-prevention education programs.
Action Z: Arm learners with the capability to protect themselves online through both appropriate risk-prevention education and teaching digital, media, and social-emotional literacies.
As learners optimize their learning networks going forward, one of the safest ways to do so is to become proficient in the new digital age literacies, not only to take advantage of the tools and content available online, but also as a means of staying safe and preserving privacy.
You can download the full “Aspen Task Force on Learning and the Internet” report here.
Charlie M. Firestone is executive director of the Aspen Institute Communications and Society Program. Under his leadership since December 1989, the Program has focused on the implications of communications and information technologies for leadership, the impact of new technologies on democratic and social institutions, and the development of new communications policy models and options for the public interest.