Three weeks left to submit proposals to the international Trust Challenge

October 10, 2014

Have questions about the Trust Challenge and how to apply? You heard it here (and we annotated it): Cathy Casserly and David Theo Goldberg talk about the Aspen Task Force report, “Learner at the Center of a Networked World,” and answer questions about the Trust Challenge application process.

Paul Oh of Educator Innovators talks to Sheryl Grant about the Trust Challenge and how to apply. (Hint: teachers are learners in connected learning environments too.) Take a look at some examples of trust challenges, check out the Aspen Task Force report, and get your inner innovator ready to apply. Looking at you, lifelong learners, educators, technologists, researchers, higher ed administrators, and K-12 leaders and thinkers, whether you work in school or out of school, in museums, libraries, schools, school districts, and more.

What’s all this talk about trust? We unpacked it during our summer series on trust in connected learning environments. Don’t have four hours to watch it all? Take a crash course with this highlight reel featuring Howard Rheingold, Audrey Watters, Barry Joseph, Anne Balsamo, Jonathan Worth and other leading thinkers working in and transforming connected learning environments.

We also dug into trust with Nishant Shah, danah boyd, David Weinberger, Dave Steer, and Cathy Davidson, who answered questions about building trust in connected learning environments; here’s what they had to say. More thought leaders are on their way, so watch for their responses to these questions and more throughout October:

  • Often when we hear terms like “student data” or “student privacy” we don’t hear them in conversation with “trust”. Do you have any thoughts on why that might be the case?

  • How are you thinking about trust in regard to connected learning?

  • What are some of the biggest challenges to engendering trust you see in connected learning?

  • Do you know of any tools, procedures, apps, and/or systems enabling or disabling trust? How are they doing this? How do these tools, procedures, and/or systems change how learning can happen in connected learning environments?

  • What are some of the literacies you think are required for learners to have a digital “trust literacy”?

  • Do you have a favorite method of creating an environment of trust in your own digital practice? In learning practices? What does it look like? Is this scalable in connected learning environments?

You’re invited: We’re hosting participatory webinars in October. Tweet questions with #dmltrust or raise your hand during these Trust Challenge webinars:

Thursday, October 23 @ 1pm EST
Rachel Anderson from the Data Quality Campaign talks with Akili Lee of Chicago City of Learning and Luis Mora of the Los Angeles City of Learning about both digital and human systems and issues of trust in connected learning environments. Register here.

Tuesday, October 28 @ 2pm EST
Maria Teresa Kumar of Voto Latino, co-chair of the Aspen Task Force, talks to Sheryl Grant about interoperability, access, digital literacy, and trust, and the call to action for a truly networked learning environment. Register here.

Thursday, October 30 @ 2pm EST
Connie Yowell of the MacArthur Foundation and Cathy Davidson, co-founder of HASTAC and director of CUNY’s Futures Initiative, will talk about the challenges and opportunities for connected learning when human and digital systems are designed with trust at the forefront. Register here.

 


WEBINAR: Building Trust (and Data Quality) in Cities of Learning

October 9, 2014

The Trust Challenge has launched a broad, open, constructive conversation about building trust in connected learning environments. We invite you to learn more about the Trust Challenge during this interactive webinar hosted by the HASTAC/MacArthur Foundation Digital Media & Learning Competition.

Our hosts include Akili Lee from Chicago’s City of Learning, Luis Mora from Los Angeles City of Learning, and Rachel Anderson from the Data Quality Campaign.

For Trust Challenge informational webinars, submit questions to our guests by emailing dml@hri.uci.edu and including “webinar question” in the subject line.

When: Thursday, October 23 @ 10am PST / 1pm EST

  • Duration: 50 minutes
  • Register at https://attendee.gotowebinar.com/register/2633109708337421570
  • Advance registration is recommended, but not required.
  • Webinar will open at 12:45 EST to allow registrants time to establish access

  • Hosted By:

    • Akili Lee, co-founder and Director of Digital Strategy and Development, Digital Youth Network
    • Luis Mora, Educational Services Coordinator, Los Angeles Unified School District
    • Rachel Anderson, Associate, Policy, Analysis, and Research at Data Quality Campaign
    • Sheryl Grant, Director of Social Networking, HASTAC/MacArthur Foundation Digital Media and Learning Competition

Archived versions of this event will be available at http://dmlcompetition.net/resources/. Information about other upcoming Trust Challenge webinars will be available at http://www.dmlcompetition.net/Blog/ and announced on Twitter from @dmlcomp with #dmltrust.


WEBINAR: Opportunities for Trust in Networked Learning with Maria Teresa Kumar

October 9, 2014

The Trust Challenge has launched a broad, open, constructive conversation about building trust in connected learning environments. We invite you to join a conversation about trust and learning with Maria Teresa Kumar, Chief Executive Officer and President of Voto Latino and co-chair of the Aspen Task Force. Learn more about the Trust Challenge during this webinar hosted by the HASTAC/MacArthur Foundation Digital Media & Learning Competition.

For Trust Challenge informational webinars, submit questions to our guests by emailing dml@hri.uci.edu and including “webinar question” in the subject line.

When: Tuesday, October 28 @ 11am PST / 2pm EST

  • Duration: 50 minutes
  • Register at https://attendee.gotowebinar.com/register/587039739597187584
  • Advance registration is recommended, but not required.
  • Webinar will open at 1:45 EST to allow registrants time to establish access

  • Hosted By:

    • Maria Teresa Kumar, co-chair, Aspen Task Force and Chief Executive Officer and President, Voto Latino
    • Sheryl Grant, Director of Social Networking, HASTAC/MacArthur Foundation Digital Media and Learning Competition

Archived versions of this event will be available at http://dmlcompetition.net/resources/. Information about other upcoming Trust Challenge webinars will be available at http://www.dmlcompetition.net/Blog/ and announced on Twitter from @dmlcomp with #dmltrust.


WEBINAR: Trust and Learning: What’s the Link? A Conversation with Connie Yowell and Cathy Davidson

October 9, 2014

The Trust Challenge has launched a broad, open, constructive conversation about building trust in connected learning environments. We invite you to join a conversation about trust and learning with Connie Yowell of the MacArthur Foundation, and Cathy Davidson, HASTAC co-founder and Director of the Futures Initiative at CUNY. Learn more about the Trust Challenge during this webinar hosted by the HASTAC/MacArthur Foundation Digital Media & Learning Competition.

For Trust Challenge informational webinars, submit questions to our guests by emailing dml@hri.uci.edu and including “webinar question” in the subject line.

When: Thursday, October 30 @ 11am PST / 2pm EST

  • Duration: 50 minutes
  • Register at https://attendee.gotowebinar.com/register/7660848726939103234
  • Advance registration is recommended, but not required.
  • Webinar will open at 1:45 EST to allow registrants time to establish access

  • Hosted By:

    • Connie Yowell, Director of Education, MacArthur Foundation
    • Cathy Davidson, Distinguished Professor and Director of The Futures Initiative at CUNY; Co-Founder, HASTAC
    • Sheryl Grant, Director of Social Networking, HASTAC/MacArthur Foundation Digital Media and Learning Competition

Archived versions of this event will be available at http://dmlcompetition.net/resources/. Information about other upcoming Trust Challenge webinars will be available at http://www.dmlcompetition.net/Blog/ and announced on Twitter from @dmlcomp with #dmltrust.


Trust and the Moment of Technological Faith: interview with Nishant Shah

October 2, 2014


HASTAC sent a series of questions to thought leaders about trust challenges and solutions that could enable trust across social contexts of connected learning and engagement. From September 3rd through October 31st we will be posting their responses to these questions on HASTAC.org.

The Trust Challenge: Building Trust in Connected Learning Environments

Trust, privacy, and safety are critical to learning in an open online world. How can learners exercise control over who sees and uses their data? What tools do they need to navigate, collaborate, and learn online with confidence? What solutions will foster greater civility and respect in online learning environments? How can open technical standards create more opportunities to share and collaborate online in a spirit of trust?

The Trust Challenge will award $1.2 million to institutions and organizations that tackle these questions in real-life learning contexts. More information about the Competition including rules, guidelines, and how to enter can be found on the Competition website.

INTERVIEW

Nishant Shah has a PhD in cybercultures and is a professor of Internet and Aesthetics of New Media at Leuphana University, Lueneburg, Germany. He is the co-founder of the Bangalore-based Centre for Internet & Society, where he was the Research Director for six years, and is also a knowledge partner to the Dutch development non-profit Hivos, working on developing new practices of change in network societies. His current research is at the intersections of body, digital technologies, gender and sexuality, collaborative pedagogy, and connected learning.

1. What about our contemporary moment makes understanding trust important?

We live in times of faith. As our technologies become more and more transparent, they also become opaque. We work with machines that promise that what we see is what we get, but that is an empty promise, because increasingly, as the lag time between the input, processing, and display of data gets reduced, we lose control over the machinations that run in the background. The contemporary moment is a moment of the interface, where all our attention is geared towards understanding, improving, and analysing the interfaces. However, these interfaces are surfaces. Interfaces hide the infrastructure. Even as we worry about better visual representations, more accurate mapping, and stronger connectivity, we are losing control of the real operations where decisions of power, control, regulation, containment, and censorship reside. This is what I call the moment of faith. In order to stop having blind faith that governments, private corporations, technologies, or indeed the people that we connect with will all behave as expected, in conditions of transparency, we need to think about trust again. Trust is faith quantified. Trust requires enumeration. Trust needs responsiveness and responsibility. And more than anything else, trust demands reciprocity. So instead of having faith, and then being constantly surprised at how different things are, we need to start thinking about trust – its processes, its mechanics, and its measurement.

2. Often when we hear terms like “student data” or “student privacy” we don’t hear them in conversation with “trust”. Do you have any thoughts on why that might be the case?

I think that there is a deliberate division of intellectual discourse, when it comes to some of the most important debates around the intersections of digital technologies and learning. The questions around data, for instance, are divided into two discrete sets. The educators and learners are invited to engage with concerns around privacy, identity, robustness and authenticity of data whereas questions of trust, security, licensing and storage are often relegated to the realms of the technologists and designers. This division is not only erroneous but downright dangerous, because it makes people believe that they don’t need to worry about the other concerns – because somebody else is taking care of it. This is why concepts like ‘trust’ become important. In order to talk about the design of trust, we will need to now straddle these divisions and think of them as not only co-existent but also inextricably tied to each other. We will have to acknowledge that questions of data identification, identity, quantification and ownership are tied closely to the digital architecture, conditions of access, protocols of design and ownership of platforms. It opens up a dialogue between the artificially created silos of technology development and content production, or technical architecture and political control, making these into technosocial questions rather than technical or social questions.

3. How are you thinking about trust in regard to connected learning?

For me, what is most important in the landscape of connected learning is to map the bottlenecks of trust. If we were to go with the metaphor of the network, and connections as intersections, then what matters most is finding the flows of trust: the traffic, the infrastructures, the nodes and hubs and routes that trust processes and data take. Especially because connected learning seeks to overturn the systems of authority which traditionally ensured and safeguarded trust practices, it becomes important to see how different communities of learners interact with each other, but also with the service providers, regulators, policy actors, and communities of support and of infrastructure. So when I think of trust in connected learning, I am more interested in thinking about how protocols of trust can be established – how trust can be measured and effectively reported, and how it can be infused with the affective, the human, the subjective, and the personal – rather than just interface and reporting fixes.

4. What are some of the biggest challenges to engendering trust you see in connected learning?

I like to think of opportunities rather than challenges, because a challenge presumes that there is an active resistance against developing trust in connected learning environments and its promises. Active resistance is easier to overcome, because it only needs education, training, and information. However, the opportunities are going to be in the more complex questions that emerge in connected learning.

The first set of opportunities is in recognising our education and learning processes as shaped by technologies. When it comes to connected learning, there is an easy argument of novelty that presumes that this is the first time our learning is intersecting with technologies. However, even the most cursory critical history will teach us that the entire modern education system has been shaped by technologies of information production, storage, and distribution. The schism between people who make apps for learning and people who engage in the process of teaching and learning has to be removed. And that is going to take more than just practice. It is going to require a common vocabulary and a dialogue that allows the different stakeholders to actually understand each other’s processes and modes of working, and maybe even engineer hands-on immersion into the work. To build connected learning, we might need to first connect the different elements involved in the field and get them to trust each other.

Given how connected learning is not restricted to traditional learning environments, the second set of opportunities is going to be shaped around what is at stake. It is easy to think of fixes and platforms and designs and databases as modes of bringing together connected learning ideas. However, we need to find a common ground, a political vision, a set of values that we embody as we work through new partnerships, open collaborations, and participatory processes in open learning. It is one thing to operationalize trust through processes and practices, but it is going to take more effort and resources to figure out that at the core of our different approaches is a common set of ideas and ideologies that bring us all together.

The third set of opportunities is in dismantling the notion of trust itself. There is already a growing rhetoric of niceness, inclusion, respect, and generosity that is often used to actually penalise radical ideas or those who refuse to subscribe to one narrative of power and politics. We need to make sure that we think of trust not as a finite thing but as a continuously iterative process which will find contradictions and discrepancies, not only on the outside but from within the community. What constitutes trust, and how do we ensure that trust does not become a monopoly or a grand narrative, but constantly allows for mistrust and scepticism, instead of developing a faith in trust?

5. Do you know of any tools, procedures, apps, and/or systems enabling or disabling trust? How are they doing this? How do these tools, procedures, and/or systems change how learning can happen in connected learning environments?

I am hoping that this competition actually brings to our attention some of the most creative ways by which trust can be enabled, and once enabled, sustained. There are many different existing protocols that are useful in making sure that trust is a part of a system, and they range from UI design to architecture to human intervention. For instance, verification certificates, checking URLs for malicious code or redirects, enforcing https logins, etc. are great ways by which access is made safe and people are encouraged to perform complex processes like financial transactions or medical data transfer. Similarly, collaborative databases that verify the provenance of information, self-editing and corrective algorithms that help provide better information sources, and user-generated curation of information that enables new insights and access to online information are all ways by which an environment of trust is created. Human intervention, where community conflicts get mitigated, offensive material gets flagged, trolls get punished, and new people are encouraged to participate, is another way by which trust gets generated. All of these are elements that we need to be able to incorporate in our connected learning environments, along with the traditionally known structures of creating safe and inclusive spaces for learners to create, innovate, and experiment with knowledge content and processes.
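
Several of the safeguards Shah lists above are things a connected learning platform can check automatically. As a rough illustration only, here is a minimal sketch of one of them, HTTPS enforcement with certificate verification for links shared with learners. It assumes Python with the third-party requests library; the function name and the example URLs are ours, purely for illustration, and not part of any tool mentioned in the interview.

    import requests

    def link_uses_valid_https(url, timeout=5):
        """Return True only if the URL answers over HTTPS with a valid certificate."""
        if not url.startswith("https://"):
            return False  # refuse plain-HTTP links outright
        try:
            # requests verifies TLS certificates by default; an invalid or
            # self-signed certificate raises an SSLError (a RequestException).
            response = requests.get(url, timeout=timeout, allow_redirects=True)
            # Distrust anything that redirects back to plain HTTP.
            return response.url.startswith("https://") and response.ok
        except requests.exceptions.RequestException:
            return False

    # Illustrative usage with placeholder links:
    for link in ("https://example.org", "http://example.org"):
        print(link, "->", "ok" if link_uses_valid_https(link) else "blocked")

A check like this covers only the transport layer; the collaborative databases, curation, and human moderation Shah describes still have to be designed on top of it.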


Settings for Trust in Connected Learning

September 25, 2014


HASTAC sent a series of questions to thought leaders about trust challenges and solutions that could enable trust across social contexts of connected learning and engagement. From September 3rd through October 31st we will be posting their responses to these questions on HASTAC.org.

The Trust Challenge: Building Trust in Connected Learning Environments

Trust, privacy, and safety are critical to learning in an open online world. How can learners exercise control over who sees and uses their data? What tools do they need to navigate, collaborate, and learn online with confidence? What solutions will foster greater civility and respect in online learning environments? How can open technical standards create more opportunities to share and collaborate online in a spirit of trust?

The Trust Challenge will award $1.2 million to institutions and organizations that tackle these questions in real-life learning contexts. More information about the Competition including rules, guidelines, and how to enter can be found on the Competition website.

INTERVIEW

danah boyd is a Principal Researcher at Microsoft Research and a Fellow at Harvard’s Berkman Center for Internet and Society. She is also the founder and president of a new think/do tank called the Data & Society Research Institute. Her research examines the intersection of technology and society. Currently, she’s focused on research questions related to “big data”, privacy and publicity, and teen culture. Her recent book – “It’s Complicated: The Social Lives of Networked Teens” – has received widespread praise from scholars, parents, and journalists.  Blog: http://www.zephoria.org/thoughts and Twitter: @zephoria

1. What about our contemporary moment makes understanding trust important?

The public projects a lot onto technology. It is seen as both the savior of our current economy and the destroyer of our cultural fabric.  The companies and organizations that build a lot of these systems are perfectly aware of how imperfect they are, but many people assume technology to be perfect and infallible (or outright evil).  To complicate matters further, the organizations that are building or employing new technologies are rarely local or connected deeply to the communities that use them.  As a result, a whole host of questions about trust emerge.  How do we understand the technologies? The companies that build them? The organizations that deploy them? The parties that abuse them? Given our general fear and misunderstanding of technology, this gets complicated very fast.

2. How are you thinking about trust in regard to connected learning?

When we talk about connected learning, we’re implicating a whole host of different actors to enable learning – educators, parents, students, librarians, administrators, government agencies, technologists, learning companies, etc. We need those varied actors to understand, respect, and trust one another. And then we need them to help bake trust into the systems that they build – technological, social, and governmental. At a technological level, trust requires security, privacy, and safety sitting at the center of the story. These things take on a different valence when we’re talking about social and governmental decision-making. But trust starts from collectively recognizing that we’re all working towards a desirable goal of empowering learners and realizing that getting there will be imperfect and require iteration.

3. What are some of the biggest challenges to engendering trust you see in connected learning?

Distrust. <grin> More seriously, I do think that there’s a lot of distrust between different actors in the network. Some of this comes from historical battles, but there is also genuine fear and concern about what new technologies and disruption writ large mean for those who have spent their lives in education. The other core issue is that people’s failure to understand technology’s strengths and weaknesses means that the public often has unreasonable expectations regarding technology and its application. This is not helped by industry actors who are happy to sell the moon without accounting for the limitations of what various tools can or cannot promise.

4. Do you know of any tools, procedures, apps, and/or systems enabling or disabling trust? How are they doing this? How do these tools, procedures, and/or systems change how learning can happen in connected learning environments?

Advancements in this arena happen at multiple levels. For example, encryption can be a powerful tool for enhancing privacy and security. Public commitments and correction procedures – such as those made by Wikipedia – can go a long way in building trust over time, even when people doubt the service at the beginning. Publicly detailed data management plans, such as those required by many federal grants, can be a great mechanism for assessing the efforts of a particular endeavor. The most important thing to remember is that no system is perfect, no procedure infallible. So a huge part of the process of building and sustaining trust is to plan for what happens when things go wrong. We do this all the time in education – think about fire drills – but we don’t realize how important this is when we think about technology.
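
To make the encryption point concrete, here is a minimal sketch, our own illustration rather than a system boyd names, of protecting a learner record at rest with a symmetric key. It assumes Python with the third-party cryptography package, and the record fields are invented for the example.

    from cryptography.fernet import Fernet

    # The key is what ultimately has to be trusted: generate it once and keep
    # it in a key-management service, never alongside the stored records.
    key = Fernet.generate_key()
    fernet = Fernet(key)

    record = b"learner_id=1234; badge=web-literacy-1"
    token = fernet.encrypt(record)      # ciphertext, safe to store or transmit
    recovered = fernet.decrypt(token)   # only key holders can read it back
    assert recovered == record

As boyd notes, no such mechanism is infallible; deciding who holds the key and what happens after a breach matters as much as the encryption itself.
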
5. What are some of the literacies you think are required for learners to have a digital “trust literacy”?

I think that people need to understand how data is collected, aggregated, sold, and used in the process of enabling all sorts of everyday services.  Why do you think you got the results you got on Google? How did Amazon decide to recommend that other product to you? Why are you seeing the ads you’re seeing on your local newspaper’s site? What happens when you Like something? The more that people can understand how data operates in a networked society, the more that we can have a meaningful conversation about trust.  And the more that people can start asking questions of the services they are using in order to hold those services accountable.


Reflections on Trust and Learning with an Evolving Internet

September 18, 2014


HASTAC sent a series of questions to thought leaders about trust challenges and solutions that could enable trust across social contexts of connected learning and engagement. From September 3rd through October 31st we will be posting their responses to these questions on HASTAC.org.

The Trust Challenge: Building Trust in Connected Learning Environments

Trust, privacy, and safety are critical to learning in an open online world. How can learners exercise control over who sees and uses their data? What tools do they need to navigate, collaborate, and learn online with confidence? What solutions will foster greater civility and respect in online learning environments? How can open technical standards create more opportunities to share and collaborate online in a spirit of trust?

The Trust Challenge will award $1.2 million to institutions and organizations that tackle these questions in real-life learning contexts. More information about the Competition including rules, guidelines, and how to enter can be found on the Competition website.

INTERVIEW

Dave Steer is the Mozilla Foundation’s director of policy and advocacy, where he shapes the organization’s public policy positions and develops programs that enable web users to have a voice in advancing and protecting the free and open web.

Dave joined Mozilla in 2014 from Facebook, where he was responsible for the company’s global policy programs in a variety of areas including teen safety, education, digital citizenship, jobs and economy, and veterans affairs. Prior to Facebook, he held leadership positions at Common Sense Media and GreatSchools.org, and ran Trust & Safety marketing at eBay and PayPal. Steer started his career as part of the initial team at TRUSTe, where he was responsible for marketing and public relations for the privacy program.

Steer holds a B.A. in political science from the University of Vermont. He serves on the Bay Area advisory board for Little Kids Rock, is an avid Phish fan, and dreams of touring in a band when he grows up. He lives in San Francisco, CA, with his wife and daughter.

1. What about our contemporary moment makes understanding trust important?

We are at a critical point in the evolution of the Internet.

We’re seeing first-hand that new technologies are enabling people all over the world to connect and share the most important parts of their lives. They are fueling innovation and opportunity, and leading to new jobs, stronger economies, and more resilient communities. They are creating new opportunities to improve academic, social and emotional outcomes through learning inside and outside the classroom, enabling educators to create whole communities that put the learner at the center.

More broadly, we’re seeing the explosive adoption of the Internet. Today, billions of people are online and, led by the massive growth in mobile usage, billions more from the most rural parts of the globe will be online in the next few years.

The same factors that are fueling this growth so dramatically, however, can work against us. Indeed, the Internet is a fragile resource, and its stability hinges on trust in the medium.

People’s comfort with privacy, security, and safety online is what drives this trust. Their comfort is shaped by many factors they read about in the news and experience in their online lives: governments undermining the security of the Internet to advance surveillance practices; policy makers threatening to destroy the level playing field of the Internet; companies tracking people’s online activities; identity thieves attacking the Internet to steal sensitive information. All of this leads to an environment of distrust, creating barriers to leveraging the Internet to advance society as a whole.

This presents us with unique challenges and opportunities.

The Internet has the potential to be the greatest shared, global resource and medium in the history of the world, accessible to and shaped by all people. It has the promise to be the first medium in which anyone can make anything, and share it with anyone.

In order for this to become a reality, particularly during this time of ever-accelerating growth, we have to codify the way in which we can earn trust and enable a vigilant, successful global community on the Internet.

2. Often when we hear terms like “student data” or “student privacy” we don’t hear them in conversation with “trust”. Do you have any thoughts on why that might be the case?

Privacy is at the core of trust.

Consider this: Over the past decade, people have been asked to share more and more information with websites and mobile services. This raises many questions: What information is being gathered? Who has access to it? Why is it being gathered? And, when it comes to Connected Learning, are educators trained with the skills they need to handle sensitive personal information?

But trust and privacy are not limited to student data collection and use. They also include social networking, which raises even more societal questions when it comes to learning and the interaction between students, families, and educators. How can educators create a safe environment in the school when so much connection among students is happening online? How can they create a culture of kindness, where bullying and cruelty are not OK? And in an age where students and educators are both exposing parts of their lives online, what constitutes a reasonable expectation of privacy?

While these open questions can raise doubt and uncertainty, there is also great opportunity in enabling young people to better develop their identities through a deeper understanding of what is private. Specifically, privacy allows people to play around with identities, which is particularly important at formative stages of life. These stages often coincide with formal education – without that space to try on different identities, people are locked into a single way of being.

In this way, the notion of privacy is subjective and contextual, and young people today are learning about privacy while developing their identity and exploring different contexts, both online and offline. danah boyd highlights the opportunities and dangers of this ‘context collapse’ in her book ‘It’s Complicated’.

3. What are some of the biggest challenges to engendering trust you see in connected learning?

Some of the biggest challenges to engendering trust are rooted in privacy. That is, what sensitive personal information is being gathered by the technologies that educators are using? How are they using the information, and how are they safeguarding it? And are educators trained to handle the information? We need to create a better understanding among students, families and all stakeholders about who ‘owns’ the data, how the data is being used, and people’s rights regarding the data. A key part of developing this understanding will be transparency – that is, making the answers to these pivotal privacy questions transparent to all parties.

Safety is another core challenge to trusting a Connected Learning experience. With fear over bullying and other issues related to how students interact with each other, educators and families question the risks associated with an increasing role of connected technologies in the learning experience. This is just one reason why social-emotional learning development must be part of any educator’s approach to Connected Learning.

The technologies associated with Connected Learning are new and evolving all the time. As a result, there will be a knee-jerk tendency from policy makers to develop rules around how these technologies are used in order to maintain privacy and safety. Without understanding the technology or how it will be used, this type of policy making approach can undermine the promise of connected learning.

4. Do you know of any tools, procedures, apps, and/or systems enabling or disabling trust? How are they doing this? What do these tools, procedures, and/or systems change how learning can happen in connected learning environments?

At the core of enabling trust in Connected Learning is a greater understanding of how the Internet works.

Today, there is an upswell of programs aimed at teaching people coding skills. These programs serve as a tool to prepare people for an information economy-driven workforce.

This is important, but I encourage people to start by engaging with systems that enable people to build trust by developing web literacy. For example, Mozilla’s Webmaker tools are easy — and FUN — ways to learn how the Web works.

A good guide to navigating the various elements of web literacy is the Web Literacy Map and related resources. These resources map to core web literacy skills that will be vital both to trust in Connected Learning and to competencies in an evolving economy. Another set of emerging tools are badging systems, such as Open Badges, that give people recognition for accomplishments. This type of incentive system inspires people to continue learning and ‘level up’.

5. Do you see a distinction between the structural conditions and experiential considerations regarding trust? If so, what are the sorts of structural/institutional structures that might engender/discourage trust in relation to learning/connected learning?

Trust is so critical to how a thriving society functions that it must be intentionally designed and architected into its fabric. Call it ‘Trust by Design’.

For example, since openness and transparency are vital drivers of trust, the architects designing a connected learning experience must include points at which they demystify and shine a light on their practices. In Connected Learning, people want to understand what information they are giving, who they are giving it to, what will be done with the information, and why it is needed. Beyond this, they want to understand the value exchange associated with Connected Learning — specifically, does new technology result in better outcomes, whether academic, social, civic, or workforce-related?

Another ‘trust by design’ element must take into account the notion that trust is an interpersonal, social dynamic. It is the dynamic that says ‘I rely on you. So you can rely on me’. In this spirit, healthy systems that engender trust enable the community to actively participate in the making of the system. Wikipedia, for example, is such a powerful engine for knowledge and learning because it is developed, maintained and safeguarded by the community.

The most successful Connected Learning systems will keep this community development and participation dynamic at the core of what they design.


Scaling Trust: How We Can Make Trust Part of Old, Traditional Systems

September 11, 2014


HASTAC sent a series of questions to thought leaders about trust challenges and solutions that could enable trust across social contexts of connected learning and engagement. From September 3rd through October 31st we will be posting their responses to these questions on HASTAC.org.

The Trust Challenge: Building Trust in Connected Learning Environments

Trust, privacy, and safety are critical to learning in an open online world. How can learners exercise control over who sees and uses their data? What tools do they need to navigate, collaborate, and learn online with confidence? What solutions will foster greater civility and respect in online learning environments? How can open technical standards create more opportunities to share and collaborate online in a spirit of trust?

The Trust Challenge will award $1.2 million to institutions and organizations that tackle these questions in real-life learning contexts. More information about the Competition including rules, guidelines, and how to enter can be found on the Competition website.

INTERVIEW

David Weinberger, Ph.D., writes about how the Internet is shaping our most fundamental understanding of ourselves. In books including “The Cluetrain Manifesto” (co-author), “Small Pieces Loosely Joined,” “Everything Is Miscellaneous,” and “Too Big to Know,” he has explored the implications of the Internet for marketing, journalism, business strategy, information organization, knowledge, politics, science, and much more. A senior researcher at Harvard’s Berkman Center for Internet & Society, he has been a Franklin Fellow at the US State Dept., an entrepreneur and marketing consultant to high tech companies, co-director of Harvard’s Library Innovation Lab, and an advisor to presidential campaigns. Dr. Weinberger’s doctorate is in philosophy from the University of Toronto.

1. What about our contemporary moment makes understanding trust important?

Trust is established through a complex set of social interactions and markers. The Net is a radically new social world where the interactions and the markers are different because the Net is an open space where anyone can participate; each person brings her own community’s expectations and norms. It is therefore very easy to misinterpret the markers that establish trust.

But the Internet is also, of course, an enormous opportunity for new social engagement. We need to face the issue of trust or else we’ll lose that opportunity.

2. Often when we hear terms like “student data” or “student privacy” we don’t hear them in conversation with “trust”. Do you have any thoughts on why that might be the case?

Trust is one of the implicit factors in any discussion of privacy. Sometimes it’s ignored because people are focusing on the technological side of privacy. That’s fine so long as the discussion is truly about the technology. For example, if people are talking about the efficacy of encrypting packets via SSL as they travel across the Internet, the question of trust is implicit and can remain so. But if the conversation assumes that SSL will solve the privacy problem, then the question of trust needs to be made explicit. Once the packets arrive at their destination, we have to ask if we trust the recipients to keep them safe.

3. How are you thinking about trust in regard to connected learning?

Trust is obviously a crucial part of any learning relationship. The learners and their teachers (who may be one and the same) have to trust one another as sources of information and, more importantly, as committed to the mutual learning process. Without this we won’t trust what we’ve learned, which is exactly the same as not having learned.

In most real-world structured learning environments — a classroom, a mentoring session — we trust the institution and we come to trust our teacher or co-learner. We come to this trust through well-established norms and markers: Is the school accredited? Does this person seem to have my interests at heart? Are my teachers and co-learners people of integrity and character? Online, those markers are not nearly as well established. The relationships are often less rich, and cultural differences can wreak havoc if they are not recognized.

4. What are some of the literacies you think are required for learners to have a digital “trust literacy”?

The main issue is that the old systems for validating sources don’t scale. There aren’t enough editors, librarians, and curators to handle the never-ending waves of ideas and information on the Net. We have to use methods that do scale, and we have to accept a higher risk that we’ll misplace our trust. (There were prices we paid with the old regime as well: a more homogenous group decided what was worth our attention and trust, and what they didn’t let through the gates became inaccessible.)

So, among the literacies now required:

  • How to evaluate a web site’s legitimacy and trustworthiness. This also requires understanding the “rules of evidence” within a domain. For example, what counts as evidence in science is different from what counts as evidence in a law court or in literary criticism.

  • Old-fashioned informal logic to spot fallacies and good arguments.

  • Crucially, learning how to use social media to evaluate the trustworthiness of sources of information — including how not to be fooled by the abusers of social media and by the structural weaknesses inherent in any system of legitimization.

  • Humility: Recognizing that understanding always occurs within a context of settled beliefs, but that all systems of settled beliefs reflect cultural, historical, and linguistic biases. So, we have to redouble our efforts to reach beyond our own cultural milieus (or “echo chambers” as they’re sometimes misleadingly called).

5. Do you have a favorite method of creating an environment of trust in your own digital practice? In learning practices? What does it look like? Is this scalable to/for connected learning? Why or why not?

Modesty and humility go a long way: acknowledging early on that there are things we as individuals don’t know or understand, and looking to others to join in the investigation.

 


Building Trust in Connected Learning Environments Interview: Cathy N. Davidson

September 4, 2014


HASTAC sent a series of questions to thought leaders about trust challenges and solutions that could enable trust across social contexts of connected learning and engagement. From September 3rd through October 31st we will be posting their responses to these questions on HASTAC.org.

The Trust Challenge: Building Trust in Connected Learning Environments

Trust, privacy, and safety are critical to learning in an open online world. How can learners exercise control over who sees and uses their data? What tools do they need to navigate, collaborate, and learn online with confidence? What solutions will foster greater civility and respect in online learning environments? How can open technical standards create more opportunities to share and collaborate online in a spirit of trust?

The Trust Challenge will award $1.2 million to institutions and organizations that tackle these questions in real-life learning contexts. More information about the Competition including rules, guidelines, and how to enter can be found on the Competition website.

INTERVIEW

Cathy Davidson, a distinguished scholar of the history of technology who was appointed to the National Council on the Humanities by President Obama in 2011, is a leading innovator of new ideas and methods for learning and professional development – in school, in the workplace, and in everyday life. She is a frequent speaker and consultant on institutional change at universities, corporations, non-profits, and other organizations, and writes for the Harvard Business Review, Wall Street Journal, Fast Company, The Chronicle of Higher Education, The Washington Post, Times Higher Ed, as well as many other academic and trade publications in the U.S. and abroad.

Davidson moved to the Graduate Center, The City University of New York, on July 1, 2014. She holds the position of Distinguished Professor and Director of The Futures Initiative, a new program designed to train the next generation of college professors and catalyze and draw upon the abundant energies and ideas of CUNY faculty and students for innovative leadership in higher education.


1. What about our contemporary moment makes understanding trust important?

This is important now because everyone is paying attention. The trust issues haven’t really changed. We should have been concerned – as individuals, institutions, and a society – about our security, privacy, and identity online since the beginning of the Internet. Certainly this was a concern for those who developed the internet early on. However, in recent months we have all become urgently, personally aware of public violations of trust: everything from the massive retail credit card security breach, to colleges and medical centers having student data leaked to hackers, to Edward Snowden’s revelations about NSA spying on private citizens.

2. Often when we hear terms like “student data” or “student privacy” we don’t hear them in conversation with “trust”. Do you have any thoughts on why that might be the case?

More than ever before, we are aware of the relationship between privacy, security, and identity. You cannot expect people to trust your network if you are not conscientiously working to earn their trust. This competition gives learning institutions the opportunity to reconsider their systems. It also gives them an opportunity to inform the public about these issues, contributing to all our digital literacy and therefore to all our safety.

3. How are you thinking about trust in regard to connected learning?

Whenever my students put anything online, in the classroom or out of it, I want them to be aware of the nature of the data they are sharing as well as the “persona” of themselves that they are making available to anyone with an internet connection. I put a lot of emphasis in my teaching on “curating” an identity, creating an online identity that represents their best public aspirations. That is a digital literacy, of course. One has to learn, in this historical moment, the difference between private and public in a new way. That, too, is part of trust.

4. What are some of the biggest challenges to engendering trust you see in connected learning?

Ignorance is one issue. People often trust online sites that are not trustworthy. So one challenge is making people aware that they need to ask why people are requesting their data, for what purpose, and at what cost. Care is another. Sometimes our private data is exposed to the unscrupulous because organizations themselves are naïve about the difficulties of security and of protecting those who have placed trust in them. Finally, we have to be aware, in a democracy, that free speech must be protected even as we must learn to be kinder and more considerate of one another. Trolls are as big an impediment to a trustworthy environment as government or corporate spies.

5. Do you know of any tools, procedures, apps, and/or systems enabling or disabling trust? How are they doing this? How do these tools, procedures, and/or systems change how learning can happen in connected learning environments?

There is a vast array of tools for learning that are applicable to the Trust Challenge. For example, verification systems are very useful for private, confidential data. They need to be better and more user-friendly so that more of us use them in our everyday lives. If huge corporations such as Google offer us verification systems that are not interoperable (that, for example, work for my cell phone but not for my iPad), then I won’t use them and they might as well not exist. Other tools, such as Mozilla’s “private browsing” settings, allow us to surf the web without others being able to track our browsing history for their purposes (not ours). In the end, it is crucial to understand the problem and address it within that specific situation. That is why I am excited that this Trust Challenge offers institutions the opportunity to survey their own vulnerabilities and then propose better ways of protecting those learning on their tools.


Informational Webinars: Applying to the Trust Challenge

September 3, 2014

The Trust Challenge has launched a broad, open, constructive conversation about building trust in connected learning environments. We invite you to learn more about the Trust Challenge during a series of interactive webinars hosted by the HASTAC/MacArthur Foundation Digital Media & Learning Competition.

During Trust Challenge informational webinars, hosts will address questions about the application process. Questions can be submitted in advance by emailing dml@hri.uci.edu and including “webinar question” in the subject line.

Trust Challenge Informational Webinars

When: Tuesday, September 9 @ 11am PST / 2pm EST

  • Duration: 50 minutes
  • Register at https://attendee.gotowebinar.com/register/6906006506679823105
  • Advance registration is recommended, but not required.
  • Webinar will open at 1:45 EST to allow registrants time to establish access
  • Hosted By:
    • David Theo Goldberg, Executive Director, University of California-Irvine Humanities Research Institute
    • Sheryl Grant, Director of Social Networking, HASTAC/MacArthur Foundation Digital Media and Learning Competition

When: Thursday, October 30 @ 11am PST / 2pm EST

  • Duration: 50 minutes
  • Register at https://attendee.gotowebinar.com/register/7660848726939103234
  • Advance registration is recommended, but not required.
  • Webinar will open at 1:45 EST to allow registrants time to establish access
  • Hosted By:
    • Connie Yowell, Director of Education, MacArthur Foundation
    • David Theo Goldberg, Executive Director, University of California-Irvine Humanities Research Institute
    • Sheryl Grant, Director of Social Networking, HASTAC/MacArthur Foundation Digital Media and Learning Competition

Archived versions of this event will be available at http://dmlcompetition.net/resources/. Information about other upcoming Trust Challenge webinars will be available at http://www.dmlcompetition.net/Blog/ and announced on Twitter from @dmlcomp with #dmltrust.