Documentary coverage of IGF-USA by the Imagining the Internet Center

IGF-USA 2012 Afternoon Plenary: Remarks from Larry Strickling

Brief session description:

Thursday, July 26, 2012 – Larry Strickling, assistant secretary for communications and information and administrator of the National Telecommunications and Information Administration of the US Department of Commerce, spoke about the United States and the global Internet.

Details of the session:

Larry Strickling, assistant secretary for communications and information and administrator of the National Telecommunications and Information Administration (NTIA) of the US Department of Commerce, highlighted the importance of the multistakeholder model in his afternoon keynote talk at IGF-USA Thursday at Georgetown Law Center.

Larry Strickling speaks during closing plenary at the IGF-USA conference in Washington, D.C. on July 26, 2012.

The NTIA has long been integral to the operation of the Internet Corporation for Assigned Names and Numbers (ICANN), which regulates global domain name policy. Although NTIA, on behalf of the US Department of Commerce, reached an agreement with ICANN in 2009 to transition the technical coordination of the DNS to a new setting in ICANN under conditions that protect the interests of global Internet users, NTIA still represents the US government on ICANN’s Governmental Advisory Committee and remains an influential force.

Given the near-infinite reach of Internet services, Strickling emphasized the need to include more global representatives in the process of domain name regulation and the discussion of related issues.

“We have focused on enhanced cooperation and finding ways for the global Internet community to have a more direct say in matters of Internet governance,” he said. “This issue is one of great importance as we head into the World Conference on International Telecommunications (WCIT) and World Trade Forum conferences over the next year, where some countries will attempt to justify greater governmental control over the Internet.”

Strickling said the NTIA has made a concerted effort to dissolve the illusion of US control over the Internet infrastructure by showing a heightened respect for the laws of individual countries and finding new ways to address conflicts of interest.

He also expressed his support of greater transparency in all organizations involved in Internet governance, including the International Telecommunication Union, and forcefully restated that the US position on Internet governance is to appropriately limit the role of the government in policymaking.

“Those of us in the US government will work to be as inclusive and transparent as we can be,” he said. “We will push back against calls for more control. Limiting ourselves to the role of facilitator is absolutely key to the ultimate success of the (multistakeholder model). We will press ahead.”

– Katie Blunt

 

IGF-USA 2012 Afternoon Plenary: Summaries of the Day’s Sessions

Brief session description:

Thursday, July 26, 2012 – The moderators and organizers of the day’s workshops, best practice forums and case study sessions presented reports rounding up the key details of the day’s discussions.

Details of the session:

From big data to youth online, policy must evolve with the Internet. Nine workshops were held at today’s IGF-USA, on topics ranging from gTLDs and big data to cybersecurity and youth online. Moderators or organizers reported the main takeaways from each workshop or best practice forum.

Workshop: Next challenge – How to handle big data in the cloud

  • The idea of big data confuses policymakers, who tend to interpret the data as numbers as opposed to actual text, image and video content. The term “big data” is often confused with politically charged rhetoric like “big government,” “big business” and “big tobacco,” joked moderator Mike Nelson, a professor at Georgetown University and research associate at CSC Leading Edge Forum.
  • The panel concurred that the European Union idea of a digital “right to be forgotten” is impractical for the majority of Internet users. Because of the interconnected nature of the Internet, erasing one user’s published content could cause a domino effect and compromise the content of other users.
  • The policies instituted thus far have created layers upon layers of future problems for policymakers, according to Nelson.

Ron Andruff speaks to conference participants during “Summary Reports from Workshops” session at the IGF-USA conference in Washington, D.C. on July 26, 2012.

Workshop: The changing landscape of the Domain Name System – New generic top-level domains (gTLDs) and their implications for users

  • A multistakeholder model for the Internet will continue to exist but needs to be refined, moderator Ron Andruff, president and CEO of DotSport LLC, said of the panel’s conclusions.
  • A healthy criticism of ICANN can only make it a better organization, Andruff said. Domain names may continue to seem irrelevant based on the continued use of external links and search engines, but they will always play a critical role on the Web.
  • There are “more questions than answers about the changing landscape,” Andruff said. “While we know the changes are coming, none of us know what those are.”

A scenario story: Two possible futures for copyright – anarchy or totalitarianism

  • This workshop focused on two possible extremes regarding copyright laws: total removal of all copyright law or complete enforcement of copyright online. A panel of three students discussed the economic and creative implications of each, led by organizer Pablo Molina, information systems technology professor at Georgetown University Law Center.
  • If a regime of anarchy concerning copyright laws ensued, creativity would flourish at an individual level, but companies would be hindered from investing in creativity at an organized level, Molina said. If a regime of totalitarianism concerning copyright laws developed, money would constantly be changing hands at the business level, but individual creativity would suffer for fear of persecution for the use of copyrighted materials.
  • Based on a survey of the panel’s audience, one third believed the U.S. would most likely adopt an anarchic copyright policy, while two thirds believed the U.S. would favor copyright totalitarianism.

    Pablo Molina, Georgetown Law Center

  • The panel concluded laws are only one part of the copyright struggle, and other important aspects include social norms and efforts for more innovative business model solutions, Molina said.
Best practice forum: ICTs for disaster response – How the Internet is transforming emergency management

  • “One of the principal takeaways was that communication and information flows have evolved to the point of being an essential service in international disaster response,” said panelist Garland McCoy, founder and president of the Technology Education Institute.
  • Considering that citizens are often the first at a scene of an emergency, social networking sites are transforming the way that the public becomes aware of emergency situations, McCoy said.

Workshop: Can an open Internet survive – Challenges and issues

  • Main challenges in maintaining an open Internet include protecting intellectual property, privacy and freedom of access simultaneously, according to moderator Robert Guerra, Privaterra founder.
  • Important measures required to maintain an open Internet include embracing the multistakeholder effort and considering different global points of view, Guerra said.

Critical Internet Resources (CIRs): Evolution of the Internet’s technical foundations

  • In order to keep the Internet alive, it is necessary to continue to develop policies and adopt standards for the future of CIRs, said panel co-organizer Paul Brigner, the regional bureau director of the North American Bureau at the Internet Society.
  • The perception that the US government is in control of Internet governance has decreased markedly as multiple stakeholders continue to participate actively in the policymaking process.

Workshop: Cybersecurity – Channeling the momentum at home and abroad

  • Unlike debates over healthcare and economic policy, debates over Internet policy have not yet produced a common public understanding, panelists agreed, according to Patrick Jones, senior director of security for ICANN.
  • There is a long way to go before policymakers fully understand the technical implications of the Internet governance legislation they are developing, Jones said.

Turning principles into practice, or not: Case vignettes – Internet Governance/ICANN, consumer privacy, cyber security, dialogues about lessons learned

  • Principles are useful as guidance alongside hard legislation, but when Internet actors do not follow them, principles alone are not enough to govern the Web, said moderator Shane Tews of Verisign.
  • Soft laws have the flexibility to update and change as the Internet evolves, Tews said.

Youth forum: Youth in an online world – Views and perspectives of youth as users

  • The majority of college students 20 and older are most concerned about privacy online, specifically on social networking sites, said moderator Dmitry Epstein, a post-doctoral fellow at Cornell University Law School.
  • Young users’ main worry about the future of Internet policy is that they will have to keep managing privacy settings if they want to enjoy free, personalized services online, Epstein said.
  • One solution Epstein proposed was a policy to make all social networking sites private by default and allow for simple ways to configure privacy, such as a switch or opt-out of personal information disclosure.
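
Epstein’s “private by default” proposal is, at bottom, a question of default values. A minimal, purely illustrative Python sketch (the settings object is hypothetical, not any real site’s API):

    from dataclasses import dataclass

    @dataclass
    class PrivacySettings:
        # Private by default: nothing is shared until the user flips a switch.
        profile_public: bool = False
        share_with_advertisers: bool = False

    settings = PrivacySettings()      # new accounts start fully private
    settings.profile_public = True    # one simple, visible opt-in toggle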

– Madison Margeson

IGF-USA 2012: Critical Internet Resources (CIRs) – Evolution of the Internet’s Technical Foundations

Brief session description:

Thursday, July 26, 2012 – Since the initiation of the Internet Governance Forum (IGF), Critical Internet Resources (CIRs) and the evolution of the Internet’s technical foundations have been a central focus of ongoing Internet governance debates. Varied views can engender misunderstandings that influence the opinions of global stakeholders, and perspectives differ on how best to advance CIRs. International governmental approaches are proposed by some, while others strongly support the present bottom-up, consensus-driven models. Three foundational technological changes – IPv6, the secure Domain Name System (DNSSEC) and secure routing – framed the discussion in this workshop. Deployment of these new technical and organizational approaches raises significant challenges for stakeholders, operations and governance arrangements.

Details of the session:

The moderator for the session was Walda Roseman, chief operating officer of the Internet Society. Panelists included:

  • Steve Crocker, chair of the board of the Internet Corporation for Assigned Names and Numbers
  • John Curran, president and CEO of the American Registry for Internet Numbers
  • Richard Jimmerson, director for deployment and operationalization, Internet Society
  • Vernita Harris, deputy associate administrator in the Office of International Affairs of NTIA, US Department of Commerce

Thursday’s IGF-USA conference at Georgetown Law Center featured a panel of government and corporate experts who addressed the controversial issues concerning the control of critical Internet resources.

Walda Roseman, chief operating officer of the Internet Society (ISOC), chaired the discussion on the implementation and security of CIRs.

CIRs include IP addresses, domain names, routing tables and telecommunications, or what Steve Crocker, CEO and co-founder of Shinkuro Inc., Internet Hall of Fame member and chair of the board of ICANN, called the base of Internet architecture upon which everything else is built.

Moving from Internet Protocol Version 4 to IPv6

One of the most pressing concerns regarding CIRs is the transition of the Internet Protocol (commonly referred to as IP) from version 4, currently the dominant protocol for Internet traffic, to version 6.

IPv4 uses 32-bit addresses, allowing for approximately 4.2 billion unique IP addresses, but the growth of the Internet has exceeded those limits. IPv6 uses 128-bit addresses, allowing for about 3.4×10^38 unique addresses. This number is equal to approximately 4.8×10^28 addresses for each of the seven billion people alive in 2012.
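
Those address-space figures are easy to sanity-check. A back-of-the-envelope sketch in Python (a minimal illustration; the constants come straight from the paragraph above):

    # Sanity-check the address-space figures cited above.
    ipv4_addresses = 2 ** 32     # 32-bit addresses: 4,294,967,296 (~4.2 billion)
    ipv6_addresses = 2 ** 128    # 128-bit addresses: ~3.4e38

    world_population_2012 = 7 * 10 ** 9
    per_person = ipv6_addresses / world_population_2012

    print(f"IPv4 total: {ipv4_addresses:,}")              # 4,294,967,296
    print(f"IPv6 total: {ipv6_addresses:.1e}")            # 3.4e+38
    print(f"IPv6 per person (2012): {per_person:.1e}")    # ~4.9e+28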

John Curran speaks about critical Internet resources during the IGF-USA conference in Washington, D.C. on July 26, 2012.

Because headers on IPv4 packets and IPv6 packets are quite different, the two protocols are not interoperable; during the transition, networks run both side by side in what is called a “dual stack.”

However, IPv6 is, in general, seen to be a conservative extension of IPv4. Most transport and application-layer protocols need little or no change to operate over IPv6. The exceptions to this are the application protocols that embed internet-layer addresses, such as FTP and NTPv3. In these, the new address format may cause conflicts with existing protocol syntax.
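
From an application’s point of view, running dual stack mostly means asking the resolver for every address family and trying whichever ones the host supports. A minimal sketch using Python’s standard library (the hostname is only an example):

    import socket

    # On a dual-stack host, getaddrinfo() returns both IPv6 (AAAA) and
    # IPv4 (A) results; clients try them in order until one connects.
    for family, _stype, _proto, _canon, sockaddr in socket.getaddrinfo(
            "www.example.com", 80, proto=socket.IPPROTO_TCP):
        label = "IPv6" if family == socket.AF_INET6 else "IPv4"
        print(label, sockaddr[0])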

Internet service providers, the Internet Society and many large Internet-based enterprises supported the World IPv6 Launch on June 6 this year to help accelerate the adoption of IPv6.

John Curran, president and CEO of the American Registry for Internet Numbers, said upgrading to IPv6 is a necessary step for “any enterprise that wants to still be in business in five years,” because it enables them to continue to reach new customers and grow.

When asked about the costs or burdens of upgrading to IPv6 for small businesses, Curran explained that in most cases the burden would fall on the hosting company through which they run their website.

Chris Griffiths, director of high-speed Internet and new business engineering for Comcast, confirmed this, stating his company would have to upgrade to continue to attract new clients.

Security issues always loom large in Internet evolution

The development of the Internet has led to a need for the Domain Name System Security Extensions, or DNSSEC. Curran explained that DNSSEC maintains the integrity of the Internet by ensuring the information users obtain comes from the source they believe they are corresponding with, essentially preventing redirection to fraudulent websites.

Redirection could come from hackers, hijackers and phishers, but also the US government, should initiatives such as SOPA or PIPA pass.
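
In practice, a security-aware client sets the DNSSEC-OK (DO) bit on its query; a validating resolver then returns RRSIG signature records and, when the chain of trust checks out, sets the AD (authenticated data) flag. A minimal sketch using the third-party dnspython library, assuming a validating resolver is reachable at 8.8.8.8:

    import dns.flags
    import dns.message
    import dns.query

    # Ask for example.com's A record with the DNSSEC-OK bit set, so the
    # resolver includes RRSIG records and reports validation via AD.
    query = dns.message.make_query("example.com", "A", want_dnssec=True)
    response = dns.query.udp(query, "8.8.8.8", timeout=5)

    validated = bool(response.flags & dns.flags.AD)
    print("Authenticated data (AD) flag set:", validated)
    for rrset in response.answer:
        print(rrset)   # A records plus their RRSIG signatures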

“My primary interest is keeping the open Internet alive,” said Richard Jimmerson, director of deployment and operationalization for ISOC. “Somebody in this room will want to invent the next Facebook or Yahoo! Today, that is possible, but if we do not pay attention to certain things, that may not be possible anymore.”

Griffiths said Comcast and other Internet technology companies work together through existing governance processes to address security vulnerabilities, adjust infrastructure and deal with other emerging challenges before they become future risks.

Steve Crocker speaks about critical Internet resources during the IGF-USA conference in Washington, D.C. on July 26, 2012.

Conflicts arise over the management of CIRs

The US government currently maintains the most control globally over CIRs. This is not well received by some critics around the world, as they fear that the United States may abuse its power. Some have also proposed that they would like to see a roadmap of the Internet for the next 20 years.

Curran addressed these concerns by stating that the US government has a positive track record of respectful and neutral administration of its responsibility for CIRs, leaving most operational details to multistakeholder global governance bodies such as the Internet Engineering Task Force and ICANN. He added that a 20-year roadmap would not likely be effective, as there are too many unknowns moving forward.

Vernita Harris, deputy associate administrator of the National Telecommunications and Information Administration, explained that the newest Internet Assigned Numbers Authority (IANA) contract expects that ICANN and aspects of control over the Internet architecture “will be multi-stakeholder driven, addressing the concerns of all users both domestic and international.”

– Brennan McGovern

IGF-USA 2012 Workshop: Next Challenge – How to Handle Big Data in the Cloud

Brief session description:

Thursday, July 26, 2012 – The dramatic reduction in the cost of computing and storage made possible by cloud computing services, the spread of easy-to-use, open-source analytic tools, and the growing availability of massive data services from governments and the private sector (e.g. Google Maps) have enabled thousands of start-ups, hackers and others to create exciting new tools for business, entertainment, government and other sectors. Government policies can help or hinder development of new databases and Big Data apps. Issues covered in this session included: 1) open government data policy; 2) Intellectual Property Rights protection; 3) IT research; 4) technology test beds; 5) education; 6) law enforcement access; and 7) privacy regulations.

Details of the session:

The moderator for the session was Mike Nelson, a professor at Georgetown University and research associate at CSC Leading Edge Forum. Panelists included:

  • Jeff Brueggeman, vice president of public policy for AT&T
  • Paul Mitchell, senior director and general manager, Microsoft TV Division
  • Lillie Coney, associate director, Electronic Privacy Information Center
  • Jules Polonetsky, director and co-chair, Future of Privacy Forum
  • John Morris, director of Internet policy, NTIA/US Department of Commerce
  • Katherine Race Brin, attorney, bureau of consumer protection, Federal Trade Commission

Mike Nelson, an Internet policy expert from Georgetown University, shared some reassuring remarks as he introduced a panel session that concentrated upon the complexities of managing what has become known as “Big Data.”

“These issues are not new,” he said. “We’ve been dealing with them for 20 or 30 years, but they are a lot more important now.”

Nelson explained in a workshop at IGF-USA Thursday at Georgetown Law Center that it’s not just about data that are big; it’s about data that change so quickly that they demand innovative management tools.

He introduced the following questions:

Polonetsky and Brueggeman exchange stories at a workshop about the cloud at IGF-USA in Washington, D.C. on July 26, 2012.

  • How will privacy concerns impact the development of large databases (or will they have any significant impact)?
  • What are the liability issues of Big Data in the cloud?
  • How do we deal with the shortage of data experts?
  • How do we handle issues concerning control and access to data?

Jeff Brueggeman, a global public policy executive with AT&T and a longtime participant in Internet governance discussion in many fora, began the conversation by addressing a few of these issues.

First, he noted the importance of working with businesses as well as policymakers to come up with tools to manage the data. He also addressed the significance of maintaining security in cloud data.

“The more data that’s being collected and retained, the more that data could be used as a target,” Brueggeman said.

Brueggeman raised several questions for the other panelists: What are best practices for dealing with large data sets? What controls should users expect over their data? Which uses of data are legitimate without user control? And how do international considerations come into play?

Jules Polonetsky of the Future of Privacy Forum followed with a look at the long-term perspective, offering some insight about the impacts of cloud technology.

“I’ve always had a hard time getting my head around clouds,” Polonetsky said. “But the best we can do is make sure we’re a bit of a gatekeeper.”

He argued that a formalized procedure should be established for the release of private information to law enforcement officials and others seeking information. He also elaborated on the risks of such technology, telling a story about his friend, a rabbi, who watched a racy video, unaware that Facebook would automatically share the link on his Facebook page. The anecdote showed how easy it is to inadvertently share online activity with the greater digital community.

Polonetsky champions the benefits of data use, but he also urges people to consider the implications of such sharing and storing of data. He said he believes there should be a debate in which people weigh the risks and benefits.

Katherine Race Brin speaks about the cloud at IGF-USA in Washington, D.C. on July 26, 2012.

Katherine Race Brin continued the conversation, citing some of her experiences dealing with these issues in her job with the Federal Trade Commission.

She said the FTC has studied the implications of cloud computing for a number of years and has also considered how, or whether, the cloud differs from other forms of data transfer in regard to privacy.

She said her work at the FTC has led her to believe that the companies that are storing data in the cloud are often in the best position to assess the risks of that data sharing.

“We’ve always said, in relation to personal data, the businesses remain accountable for the personal data of their customers,” Brin said.

She said that while the FTC holds businesses responsible, it also provides a framework to ensure consumer privacy.

Brin explained the three key aspects of this framework:

  • Privacy by design – Companies should build in privacy protection at every stage, from product development to product implementation. This includes reasonable security for consumer data, limited collection and retention of such data and reasonable procedures to promote data accuracy.
  • Simplified consumer choice – Companies should give consumers the option to decide what information is shared about them and with whom. This should include a “do-not-track” mechanism that would provide a simple, easy way for consumers to control the tracking of their online activities (a brief sketch of honoring such a signal follows this list).
  • Transparency – Companies should disclose details about their collection and use of consumers’ information and provide consumers with access to the data collected about them.
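
As deployed in browsers, the do-not-track signal mentioned above is simply an HTTP request header, DNT: 1. A minimal, hypothetical server-side check (a sketch of the idea, not any particular site’s real code):

    def should_track(headers: dict) -> bool:
        # Browsers with do-not-track enabled send the header "DNT: 1".
        # A compliant site checks it before setting tracking cookies.
        return headers.get("DNT") != "1"

    print(should_track({"DNT": "1"}))   # False: the user opted out
    print(should_track({}))             # True: no preference expressed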

Lillie Coney, associate director of the Electronic Privacy Information Center, offered her insights as an expert on big data.

“Governance is not easy,” Coney said. “But we do learn mechanisms for creating accountability, transparency and oversight.”

She noted that the difficulty lies in creating guidelines that have currency and legitimacy. In regard to cloud computing, Coney suggested that people are not only consumers; they themselves – or at least the sets of private information they share – are actually products.

“Our online activity alone generates revenue, and many consumers don’t understand that,” Coney said.

She said she strongly believes in the importance of the public’s engagement in the conversation. With all these privacy concerns, Coney said the consumer cannot afford to leave it up to businesses or government.

Lillie Coney speaks about the cloud at IGF-USA in Washington, D.C. on July 26, 2012.

Microsoft executive Paul Mitchell added some perspective on how to go about tackling the management of Big Data. “I think the big challenge here is figuring out what’s first when we’re talking about big data,” Mitchell said, noting the overwhelming amount of data being created and stored in databases. “What we have here is not a new problem. What we have here is a problem of scale.”

Mitchell said we can look at the separate desires of people, businesses and society, and consider a philosophy based on each group’s needs. He explained that the people-first philosophy would ensure that data that could be harmful isn’t allowed to be. The business-first philosophy would be about maximizing the potential economic return for the use of data. The society-first philosophy would optimize the value for society as a whole based on what can be done with the data.

“From an operating perspective, the challenges we face involve how to govern against these three axes,” said Mitchell. “The policymakers’ choice is how to balance the three appropriately.”

Nelson then asked the panelists about the future of the cloud: will there be one cloud in an interconnected world, or a world of separate clouds run by different companies?

Mitchell argued that there are circumstances that will require a private set of services.

Coney expressed concern over that model. “Consumers’ control over their data in a cloud-driven environment will require the ability to move their data from Cloud A to Cloud B. Making that a reality in this environment is going to be the challenge,” she said.

Polonetsky had a slightly different viewpoint. He considered the business-platforms perspective, questioning how easy it should be to move consumers’ data.

“Yes, it is your data, but did the platform add some value to it by organizing it in a certain way?” he asked, adding that platforms may make a legitimate contribution by organizing consumers’ data. For example, your Facebook friends belong to you, but Facebook created the platform in which you interact with them and share information, photographs and other things.

To conclude, Nelson took a few questions from the audience and asked each panelist for a recommendation regarding the future management of Big Data. Brueggeman suggested there should be a set of commonly accepted practices for managing Big Data. Polonetsky added there should be more navigable digital data. Brin supported the strong need for transparency. Coney proposed that cloud providers and Big Data companies must show their respect for a diverse group of stakeholders. Mitchell recommended that we should all work toward greater harmony between business, personal and societal values.

– Audrey Horwitz

IGF-USA 2012 Scenario Story: Two Possible Futures for Copyright – Anarchy or Totalitarianism

Brief session description:

Thursday, July 26, 2012 – The laws of copyright were introduced before the Internet, before file-sharing and before the advances in digital tools now used to create sampling, mash-ups and remixes. One example of the complex copyright conflicts faced today is “The Grey Album,” produced by DJ Danger Mouse. It gained notoriety as it challenged the intellectual property structure in place, mashing two legally protected albums in a violation of copyright law. Danger Mouse created the album strictly as a limited-edition promotional item (only 3,000 copies), but it immediately went viral and caught the ear of many people in the music industry and all over the US, making any legal cease-and-desist request technically meaningless. This example illuminates the incredibly complex and nuanced existence of copyright law in America today. This scenario exercise was aimed at exploring two divergent sides of America’s copyright future, one where regulations surrounding copyright law are lax to the point of anarchy, and the other where the regulations increase at an exponential rate, creating a totalitarian atmosphere.

Details of the session:

Moderators for the session were Ariel Leath and Kalyah Ford, graduate students at Georgetown University. Panelists included:

  • Thomas Sydnor II, senior fellow for intellectual property at the Association for Competitive Technology
  • Matthew Schruers, vice president for law and policy at the Computer & Communications Industry Association
  • Brandon Butler, director of public policy initiatives for the Association of Research Libraries

Thomas Sydnor II speaks in a workshop about the future of copyright at IGF-USA in Washington, D.C. on July 26, 2012.

This scenario exercise at IGF-USA 2012 featured a consideration of what might happen if one or the other of two extreme situations – totalitarianism or anarchy – evolved in the future. Students from Georgetown University proposed two possible futures for panelists to discuss.

Scenarios: In an anarchist 2020 scenario, panelists discussed what might happen if a high school student turned in work incorporating aspects of Ernest Hemingway’s “The Sun Also Rises.” Would a teacher be expected to treat it as an original work? In a totalitarian 2020 scenario, panelists discussed a situation in which the phrase “good morning” is owned by McDonald’s, and any online use of it would instantly set off an alarm automatically requiring that the violator of the copyright pay a $500 fine.

These two copyright scenarios are highly unlikely, according to panelists at IGF-USA Thursday at Georgetown Law Center, but they are still interesting to ponder. The panelists discussed the potential ramifications of both extremes.

“As far as totalitarianism, if (the United States) were to fall into totalitarianism, we’d have done it a long time ago,” said Thomas Sydnor II, a research fellow at the Association for Competitive Technology. “When I take my walk with my dogs, my dogs trespass on my neighbors’ lawns, and I go and I trespass on my neighbors’ lawns to clean up what they left on my neighbors’ lawns. And yet, I do this every day and there is not the slightest chance that I will ever be sued for it, much less arrested because we all realize that, to a certain extent, part of rights is exercising a little restraint in enforcement.”

Sydnor also stressed the importance of thinking about where the Internet and its users are going in the long run in terms of copyright law enforcement. “We don’t need to have perfect enforcement, but we do need better than we have now,” he said.

Thomas and Matthew share laughs during a workshop about the future of copyright at IGF-USA in Washington, D.C. on July 26, 2012.

“Whether we like it or not, it’s a much more complex copyright environment today,” said Pablo Molina, information systems technology professor at Georgetown University Law Center.

“I consider it similar to the tax laws. Tax laws are so complicated, and there are so many tax laws passed every term, that you need really expert tax lawyers and CPAs and other people just to figure out how to file corporate or individual taxes when things get complicated. I would argue that it is the same thing with copyright law.” He said we are likely to be moving toward more and more legislation and more and more enforcement.

Panelist Matthew Schruers of the Computer & Communications Industry Association argued that while law regulates the Internet, the impact of other vital factors must figure into decisions about the Internet as well, including markets, architecture and social norms.

Schruers predicted that even if copyright law goes in the direction of anarchy, human norms will most likely still prevent people from entirely disregarding the idea of copyright law.

He said that if he were asked to predict which direction Internet regulation of intellectual property is most likely to go in the future, it would be more anarchic than it is today.

“In a low-protectionist anarchy environment, you’re likely to see more noncommercial and derivative work that is based largely on noncommercial creation,” Schruers said. “Control needs to be effective in order to produce [a totalitarian environment].”

Given the same choice regarding which direction on the totalitarian-anarchist spectrum society is most likely to go in the future, Molina said he believes society is moving in the direction of totalitarianism. Even so, he said he believes a full tilt to this extreme is unlikely. “There are always ways for people to circumvent the system,” Molina explained. “Both scenarios are possible. Whether they are likely is a different story.”

In terms of other factors important to the copyright law discussion, Molina and Schruers both said economic growth is an extremely good measure to assess when seeking balance between the extremes.

“In terms of progress, economic progress is the best metric we have,” Schruers said.

– Mary Kate Brogan

IGF-USA 2012 Best Practice Forum: ICTs for Disaster Response – How the Internet is Transforming Emergency Management

Brief session description:

Thursday, July 26, 2012 – Recent man-made and natural disasters around the globe have highlighted the importance of ICTs for connecting public safety officials, coordinating response operations and keeping citizens informed. Additionally, new and emerging Internet-based tools, mobile applications and social media have transformed disaster-relief efforts, providing real-time data for first responders and empowering citizens to access and share life-saving information and locate loved ones. Enhanced situational awareness via multiple platforms offers almost instantaneous, ubiquitous information about threats to life and property and the individuals affected by natural or man-made risks. Internet-based communication is increasingly relied upon to support disaster preparation, response and recovery. Workshop participants looked at what must be done to ensure resilient infrastructures and continuity of operations, including keeping citizens informed. Panelists were invited to share their perspectives and the lessons learned from recent disasters and to work to identify recommendations for collaboration among stakeholders in preparing for future disasters.

Details of the session:

The moderator was Joe Burton, counselor for technology and security policy in the Communications and Information Policy Office of the US State Department. Panelists were:

  • Garland T. McCoy, founder and president of the Technology Education Institute
  • Kristin Peterson, CEO and co-founder of Inveneo, an organization that provides ICTs to remote areas
  • Keith Robertory, disaster response emergency communications manager for the American Red Cross
  • Veronique Pluviose-Fenton, a congressional staffer who focuses on homeland security and disaster response
  • Tom Sullivan, chief of staff of the Federal Communications Commission

Véronique Pluviose-Fenton speaks at a workshop on ICTs for Disaster Response at IGF-USA in Washington, D.C. on July 26, 2012.

Last month, severe storms in the Northern Virginia/Washington, D.C., metro area not only knocked out Internet service, but also caused an outage of 911 Emergency Response telephone services that lasted four days.

The Best Practice Forum at IGF-USA Thursday at Georgetown Law Center featured a discussion between government and NGO representatives on how to address this type of scenario and best coordinate disaster response in the current technological era.

According to Garland McCoy, founder of the Technology Education Institute, the 911 outage highlights the flaws of the current IP-backed telephone system, which evolved from the analog, hard-wired telephone system.

“Back in the twisted copper-wire days, the power could go out but your phone would stay on,” McCoy said. But the IP phone system now has “hub and spoke” architecture with a single point of failure, known as a Big Data facility.

Véronique Pluviose-Fenton, a congressional staffer who focuses on homeland security and disaster response, spoke on the failures of the communication system following major catastrophes such as Hurricane Katrina and the terrorist attacks of Sept. 11, 2001.

Pluviose-Fenton emphasized the importance of interoperability—the ability of networked communications systems to communicate with each other.

“We all watched live what happens when they (first responders) couldn’t communicate,” she said, referencing the chaos of the 2001 attacks on the United States, when police officers and firefighters could not talk or relay warnings.

Keith Robertory, disaster services technology manager for the American Red Cross, said it’s possible to build an entirely interoperable network, but there are quite a few political roadblocks standing in the way. “Can you imagine if the New York police chief and fire chief are trying to agree who owns a shared network and who controls it?” Robertory asked, illustrating the difficulty of interconnectivity.

Pluviose-Fenton agreed, saying, “I still fundamentally feel that even with the advances in technology, there still is a problem with will.”

This is not just a domestic issue, as disasters in foreign countries have also put communication technology to the test. US agencies and NGOs often join the global-assistance efforts when disaster strikes elsewhere.

Kristin Peterson, CEO of Inveneo (a non-profit focused on ICTs in the developing world), discussed her role in establishing a wireless network in Haiti following the 2010 earthquake that destroyed nearly all existing communication systems in the island nation. Every aid group providing relief had its own network, from the American Red Cross to the US military.

“Within 24 hours we knew we had to put up a WiFi network,” Peterson said.

The task took several days but was a necessary step in orchestrating the global response in aiding Haitian refugees, from providing food and water to distributing shoes sent by singer Jessica Simpson.

“If you can’t communicate, you can’t coordinate your response,” Peterson said.

Tom Sullivan speaks at a workshop on ICTs for Disaster Response at IGF-USA in Washington, D.C. on July 26, 2012.

Tom Sullivan, chief of staff of the US Federal Communications Commission, said that even Japan, a country with an extremely sophisticated communications system and other cutting-edge technology, had to depend on a backup power grid following the 2011 earthquake.

He said it is necessary for the United States to develop a strong contingency communications plan in order to be prepared for the inevitable arrival of yet another Katrina-esque catastrophe or any devastating emergency situation. Robertory elaborated on this need. He supervises American Red Cross efforts to establish emergency communications infrastructures when providing relief to victims of disasters.

He and Sullivan also emphasized the importance of citizen engagement in a field where first response is not and never will be 100 percent reliable.

“If 911 services were bad, wouldn’t you be more likely to learn first aid and CPR?” Robertory asked. He explained that citizens should form their own personal contingency plans should communication fail in the aftermath of a disaster.

All of the panelists agreed that advances in technology provide both new opportunities and new challenges for those responsible for disaster relief.

– Brennan McGovern

IGF-USA 2012 Workshop: Can an Open Internet Survive – Challenges and Issues

Brief session description:

Thursday, July 26, 2012 – This workshop focused on the challenges of keeping the Internet open while simultaneously maintaining a safe and secure environment for individuals, businesses and governments. Governments encounter a wide-ranging set of issues and concerns that can limit an open Internet, including the cost of connectivity, spam/malware, intellectual property rights, human rights and objectionable content. Businesses often make decisions for business purposes that may contribute to closing off the Internet. Leaders in governments’ legislative branches, including the US Congress and its counterparts around the world, and business leaders do not always recognize the implications of the actions they take that might negatively influence the Internet. In addition, citizens may voluntarily, but without full understanding, accept moves that contribute to closing off the Internet, quietly accepting actions and decisions that affect its openness in a negative way. The session worked to identify the key characteristics of an open Internet; the global and national challenges that threaten this; the initiatives pursued to advance the open Internet; and multistakeholder engagement to develop and promote an open Internet.

Details of the session:

The session was moderated by Robert Guerra, principal at Privaterra and senior advisor to Citizen Lab in the school of global affairs at the University of Toronto. Panelists were:

  • Ellen Blackler, vice president for global public policy, The Walt Disney Company
  • Thomas Gideon, technical director of the Open Technology Institute at the New America Foundation
  • Andrew McDiarmid, policy analyst at the Center for Democracy and Technology
  • Julian Sanchez, research fellow at the Cato Institute
  • Paul Diaz, director of policy for the Public Interest Registry
  • John Morris, director of Internet policy, office of policy analysis and development of the US National Telecommunications and Information Administration

Ellen Blackler participates as a panelist about challenges and issues facing an open Internet at IGF-USA in Washington, D.C. on July 26, 2012.

Between copyright infringement, intellectual property, piracy and protection of online privacy, the openness of the Internet is being threatened on all sides, according to six IGF-USA panelists, who gathered to define and assess the challenges to an open Internet Thursday at Georgetown Law Center.

“The free and open Internet oughtn’t be a free-for-all,” said Ellen Blackler, vice president for global public policy for The Walt Disney Company.

Balancing an open Internet against the need to ensure security and privacy and to minimize piracy has always loomed as one of the largest challenges to the future of the Internet. While members of this panel represented diverse Internet backgrounds, they all agreed that Internet policy must and will continue to evolve with the challenges posed by the struggle between these often-competing values.

What is Internet openness?

The definition of an open Internet differs even among seasoned IGF attendees.

John Morris of the National Telecommunications and Information Administration (NTIA) cited the principles of Internet openness recommended by the Organisation for Economic Co-operation and Development (OECD) last year, which highlight several key characteristics, including the opportunities for both collaboration and independent work.

An open Internet allows users to operate “independently of one another, so as not to have a centralized single body to control or impose regulations,” Morris said.

The Internet policymaking process additionally needs to be open for collaboration, Morris said.

“What is it that keeps barriers low, what steps can we take to address challenges?” asked Andrew McDiarmid, policy analyst for the Center for Democracy and Technology (CDT). “It’s about learning … to keep the process more open and open to more voices.”

Though the openness of the Internet is one of the Web’s key characteristics, challenges ensue when openness trumps privacy.

“The openness principle has failed the public in privacy interest,” Blackler said.

U.S. policies directly affect those abroad

In the United States, Internet access is virtually everywhere, but the major challenge for Internet openness in many other parts of the world is online accessibility, especially in remote areas and in developing nations.

Robert Guerra acts as moderator during a workshop about challenges and issues facing an open Internet at IGF-USA in Washington, D.C. on July 26, 2012.

“Access at an affordable cost is key because then we can innovate,” said panel moderator Robert Guerra, the founder of Privaterra.

Panelists agreed that although across-the-board global policies on the issues tied to Internet openness are unlikely to be established, given differing cultural values and standards from country to country, cooperation on the international scale is still quite important.

“Not that I think we need to achieve one global norm about a particular issue, but we need to achieve a global level of interoperability,” Morris said.

In some countries, global Internet operability is a major issue due to government blocking and filtering – the management of what content citizens may or may not access or share. Thomas Gideon of the Open Technology Institute noted the difficulties that global policymakers face with nations that exercise a great deal of control over available content.

“A large part of what I do in my work is to defend human rights online,” Gideon said. “That’s equally fraught with the risks that those trying to speak freely in contentious and crisis regimes face.”

Paul Diaz, director of policy for the Public Interest Registry, noted the challenge of governance measures working both locally and globally. “What works in one environment, what may work here in the US, is not necessarily applicable in another country,” he said. “Ultimately, the Internet is global, and therein lies the challenge.”

Piracy and copyright: What is the solution?

When discussing the widespread nature of piracy online and the difficulty in regulating it, panelists differed in their preferred approach to dealing with the challenges of intellectual and copyrighted property.

“Companies like Netflix are slowly finding ways to shift from a product to a service model,” Julian Sanchez, a research fellow at the Cato Institute, said, suggesting this as one successful choice for property owners.

Sanchez argued that the best way to discourage piracy is to create services that offer consumers a wide variety of choices and control over consumption of goods at a fair price. He said this is a better method than exclusively offering products that can be copied and shared and pirated just as easily.

Private niches online: Social networking and the cloud

With the advent of social networking and the desire to share and access personal information, the Internet includes private and targeted content, as well.

Sanchez emphasized that the structure of the Internet should be seen more as a network of people and relationships than as a technological architecture.

Facebook’s Sarah Wynn-Williams said social networking represents the “desire for people to connect and share and be open,” adding that the future of Internet policy must meet these demands and “preserve the ability of people to [share personal content online,] which is genuinely under threat.”

Panelists also noted that files shared through cloud data storage continue to be as difficult to regulate as physically shared materials. Just as the government has largely chosen not to investigate copied CDs or cassettes distributed among friends, content in the cloud is similarly difficult to trace and regulate.

– Madison Margeson

IGF-USA 2012 Workshop: The Changing Landscape of the Domain Name System – New Generic Top Level Domains (gTLDs) and Their Implications for Users

Brief session description:

Thursday, July 26, 2012 – Early in 2012, ICANN launched the process to introduce vast numbers of new generic top-level domains (gTLDs) — allowing, for the first time, the customization of Internet addresses to the right of the dot. Few people understand that there are already 22 existing gTLDs and 242 country code TLDs, with a total of 233 million registered second-level names across all TLDs. In the coming years, these existing TLDs will be joined by numerous new gTLDs, likely resulting in the registration of millions of new second-level domains. Some will use scripts that are unfamiliar to English speakers or readers. How exactly these new gTLDs will impact the world of users and registrants is yet to be determined. Will they add significant new registration space, cause confusion, provide some unique innovations or, most likely, all of the above to some degree? ICANN received a wide range of applications – including brand names, generic terms, and geographic and regional terms. The workshop was organized to discuss issues and questions including: changes to how domain name registrants and users may organize and search for information online; how defensive registrations may impact existing registrants; whether ICANN gave a sufficient focus to Internationalized Domain Names; how applications from potential registries from developing countries are supported; whether fraud and abuse that exists in the existing gTLD space will migrate easily into the new ‘spaces’ or even be compounded; and how conflicts among applicants from the noncommercial sector will impact the users of the Internet.

Details of the session:

The session was moderated by Ron Andruff, president and CEO of DotSport, LLC. Panelists included:

  • Laura Covington, associate general counsel for global brand and trademarks, Yahoo!
  • Bobby Flaim, supervisory special agent with the Federal Bureau of Investigation
  • Suzanne Radell, senior policy adviser, NTIA, and US Government Advisory Council representative at ICANN
  • Elisa Cooper, director of product marketing, MarkMonitor (remote participant)
  • Alan Drewsen, executive director of the International Trademark Association
  • Andrew Mack, principal and founder of AMGlobal Consulting
  • Krista Papac, chief strategy officer for ARI Registry Services

Respondents were Dan Jaffe, executive vice president for government relations of the Association of National Advertisers, and Jeff Neuman, vice president for business affairs of Neustar and Generic Names Supporting Organization councilor at ICANN.

Suzanne Radell participates as a panelist about the changing landscape of the Domain Name System at IGF-USA in Washington, D.C. on July 26, 2012.

There is a mix of concern and optimism for how the new generic top-level domains (gTLDs) will change the landscape of the Internet, but it’s certain that a new era of the Internet is coming.

A diverse panel at IGF-USA Thursday at Georgetown Law Center, offering perspectives ranging from brand owners to trademark security, agreed on one thing: The introduction of new gTLDs will open the Internet up to more users, but also to more bad actors and cybersquatters. The panel agreed that the gTLD program will result in a tremendous amount of change; how it will affect the landscape, and whether that change is good, sparked the most discussion.

This year, there are 2.3 billion users of the Internet and 555 million websites. The numbers are staggering, considering the Internet is only about 14 years old, said moderator Ron Andruff, president and CEO of DotSport LLC.

There are 22 existing gTLDs – including .com, .net, .org and .edu – and 242 country code TLDs.

Elisa Cooper, director of product marketing at MarkMonitor, joined the panel remotely to give an analysis and breakdown of new gTLD application statistics.

Of 1,930 applications for a new gTLD, 652 were .Brand applications. Cooper divided the applications into three categories: brand names, community-based and generic. The two flavors of generic are closed and open – the latter makes registries available to the general public with few eligibility requirements. Cooper also revealed:

  • There is a relatively low number of Internationalized Domain Names – only 116.
  • Geographically, the majority of the applications have come from North America and Europe.
  • Of the .Brand applications – which go through the standard application process – the technology, media and financial sectors led the way.
  • The most highly contested strings were .APP, .INC, .HOME and .ART.
  • The top three applicants were Donuts, Google and Amazon.

Laura Covington, who serves as chief trademark and brand counsel for Yahoo!, joined the panel from a .brand applicant company and offered a brand owner perspective. Yahoo! applied for .yahoo and .flickr.

“I think there are a lot of exciting opportunities from a marketing perspective, even from a security perspective with the new gTLDs and the new .brands in particular,” Covington said. “And I also think that it’s going to have to change the face of how trademark owners, brand owners deal with their enforcement issues, how they approach protecting their marks going forward.”

Yahoo! is viewing the new gTLDs as an amazing new world and a new way to reach customers, though Covington admits uncertainty about what search engines will do once gTLDs are added to the mix of search algorithms. As a brand owner, she has concerns about how to deal with second-level names, because there will be an exponential increase in opportunity for cybersquatters.

Flaim (FBI) and Papac (ARI) participate as panelists about the changing landscape of the Domain Name System at IGF-USA in Washington, D.C. on July 26, 2012.

Bobby Flaim, FBI special agent, is primarily concerned with the pre-existing problems with domestic and international law enforcement of the Internet and how the problems may worsen as bad actors become more prevalent.

The existing system has some major problems with cybersquatting, said Jaffe, group executive vice president of ANA. He said he didn’t want to be the panel’s doomsayer, but he added that no one should assume the new gTLD program will roll out in a smooth or timely manner.

One hugely positive impact of the new gTLDs Covington sees is an influx of new voices and new participants in the multistakeholder process.

Krista Papac, general manager of ARI Registry Services, agreed.

“I do have faith in the multistakeholder model and hope that we continue to find our way through it and deal with the different issues,” Papac said.

Papac is running some of the registries for the new gTLDs and sees a lot of opportunity to create more secure environments and more opportunities from brands.

Suzanne Radell, senior policy adviser in the Office of International Affairs at NTIA and US GAC Representative, said that more people and more interest in the program will be crucial to ICANN’s evolution.

“We’ve got our fingers crossed that the benefits to consumers, to users are not outweighed by risks and costs,” Radell said. “So we’re looking very much forward to a review of the new gTLD program.”

Alan Drewsen, executive director of INTA, said he expects that the introduction of the new gTLDs will go more slowly and be less successful than hoped.

“ICANN will continue to exist, though I think it’s done everything possible to put its life in jeopardy,” Drewsen said, making the audience and panel laugh.

Andrew Mack, AMGlobal, speaks at a workshop about the changing landscape of the Domain Name System at IGF-USA in Washington, D.C. on July 26, 2012.

INTA has been critical of the process that ICANN has led over the last several years in introducing the new gTLDs.

“Given the amount of time and money that the members have invested in this process and the potential consequences that can flow from its failure, INTA will continue to work collaboratively with a lot of these constituencies to get the best possible results,” Drewsen said.

Andrew Mack, principal of AMGlobal Consulting, sees a large concentration of applications in the global North and the English-speaking world. People in the global South won’t be able to participate in a program they don’t know exists. Seventeen gTLD applications are better than none, he said, but the number of applicants from other parts of the globe amounts to a paltry total compared with highly connected regions already experiencing huge economic shifts due to the Internet. Mack said his pessimism is rooted in the fact that Africa and Asia are missing out when they could really benefit.

“And we want them to be part of our Internet,” Mack said.

Much of the influx of new participants actually comes from existing participants, Neuman of Neustar noted.

The new gTLDs open up a lot of opportunities for business and marketing folks, but each person on the panel defined success in different ways.

“It’s definitely going to be an exciting time,” said Brian Winterfeldt, a partner with Steptoe & Johnson LLP. “I think we really are moving into sort of a new era of the Internet with this expansion and I think it’s going to be very exciting to see how it evolves.”

– Ashley Barnas

IGF-USA 2012 Opening Plenary Roundtable: Emerging Internet Issues – Governments or Governance?

Brief session description:

Thursday, July 26, 2012 – This major session of the opening plenary of IGF-USA discussed the current state of play of various proposals, ranging from the WCIT to the UN Commission on Science and Technology and Enhanced Cooperation, and the areas where, from some perspectives, more government may be called for, or where strong improvements in “governance” are needed. Panelists offered a range of perspectives about government and governance.

Details of the session:

The session was moderated by Marilyn Cade, the chief catalyst of IGF-USA. Panelists included:

  • Rebecca MacKinnon, the Bernard L. Schwartz Senior Fellow at the New America Foundation
  • Marc Rotenberg, president of the Electronic Privacy Information Center
  • Jacquelynn L. Ruff, vice president of International Public Policy and Regulatory Affairs for Verizon Communications
  • Paul Brigner, the regional bureau director of the North American Bureau at the Internet Society
  • John Curran, president and CEO of the American Registry for Internet Numbers
  • Kristin Peterson, co-founder and CEO of Inveneo
  • Fiona Alexander, associate administrator of the Office of International Affairs at NTIA

If there’s a keyword lying at the heart of the Internet Governance Forum, it is “multistakeholder.” Key is the belief that individuals from various backgrounds, from private industry to civil society to government to academia, benefit from gathering to discuss their visions for the future, and the viability thereof. Whether they’re able to reach any consensus in those discussions is another matter entirely.

The 2012 IGF-USA conference, held at Georgetown Law Center in Washington, D.C., Thursday, opened with a panel showing just how diverse these individuals can be, and how varied their focus is in regard to the pressing issues facing the parties looking to influence the continued growth of the Internet.

Rebecca MacKinnon from the New America Foundation speaks at the Opening Plenary Roundtable at IGF-USA in Washington, D.C. on July 26, 2012.

Rebecca MacKinnon of the New America Foundation opened the seven-member discussion by highlighting the importance of the “digital commons,” the non-commercial backbone providing structure to a number of vital digital institutions. Because of the shared nature of this backbone, which stretches across traditional nation-state boundaries, MacKinnon said she believes the world is on the verge of a reformation of the current governing concepts, as individual states try to gain control over institutions that involve those beyond their jurisdiction.

In the modern era, MacKinnon asserted, individuals are “not just citizens of nation-states and communities, we’re citizens of the Internet.”

“We have to be informed about how power is exercised,” she continued, highlighting a need for everyone involved to play their part in shaping the direction of the Internet’s evolution.

This, in turn, circles back to not just the perceived necessity for multi-stakeholder solutions, but the lingering questions as to how those solutions are reached.

“How do we ensure that the policy-making mechanisms actually allow input from all affected stakeholders?” MacKinnon asked.

She theorized that societies are on the precipice of a “Magna Carta” moment, in which the traditional concepts that dictate the ways in which governments work will be disrupted by this multistakeholder model.

This drew a degree of pushback from other members of the panel.

Fiona Alexander, associate administrator at the Department of Commerce’s National Telecommunications and Information Administration, agreed with MacKinnon that some nations may be standing at that edge, but said the Magna Carta moment isn’t to be expected of every country, or even every stakeholder taking part in current dialogue.

“They [unnamed stakeholders] have in many cases failed to live up to what’s expected of them,” she said, which leaves those advocating for multistakeholder solutions defending a model of governance under siege and fosters doubts about its efficacy.

And a large number of those stakeholders are far behind those in developed, Western countries in regard to Internet penetration.

Fiona Alexander speaks at the Opening Plenary Roundtable at IGF-USA in Washington, D.C. on July 26, 2012.

Kristin Peterson, co-founder and CEO of Inveneo, a non-profit organization dedicated to the proliferation of communications technology in the developing world, shared just how much work needs to be done in bridging the gap between dominant Internet stakeholders and those just attaining reasonable access to the Web.

“Internet access is important not just on an individual level, but on a functional level, an organizational level,” she said.

Part of this is due to the remoteness of developing, rural areas, which drives up the cost of infrastructure to a counterproductive degree.

A single 1 Mbps connection, Peterson noted, which would be suitable for a school or a medical clinic, costs upwards of $800 a month in Haiti. Another unnamed country that Inveneo has worked with has less than 100 Mbps of connectivity in total, and there that same 1 Mbps of Internet access costs roughly $2,000 per month.

On the opposite end of the spectrum, far removed from countries just beginning to break down the barriers preventing them from gaining full access to the Internet, are stakeholders who, in the minds of some, will have an inordinate amount of influence over multi-stakeholder debates.

Marc Rotenberg, president of the Electronic Privacy Information Center, highlighted the influence of corporate entities as one such problem.

Comparing growing corporate influence over the Internet to “the clouds gathering at the beginning of a Batman movie,” Rotenberg warned those in attendance, “You have to pay attention when the skies darken, things are about to happen.”

One such entity, which Rotenberg accused of having an outsized and ever-growing influence over the Internet, is Google, whose expanding presence on the Web he called the “number-one threat to Internet freedom.”

Regardless of whether that’s the case, such problems do require a means to draw in those affected by the evolving dialogue on Internet governance.

John Curran speaks at the Opening Plenary Roundtable at IGF-USA in Washington, D.C. on July 26, 2012.

“How do we get people engaged, how do we raise a flag and pull in society, business, governments?” asked John Curran, president and CEO of the American Registry for Internet Numbers.

Curran offered perspective on the scope of the problems facing Internet stakeholders, which he described as layered. At the bottom layer are technical standards and protocols, which require little political involvement. Above them sit domain names and IP addresses, which aren’t necessarily the most hot-button social issues under debate within the halls of Congress but nonetheless raise privacy and tracking concerns. At the top are the broad end-user experiences that draw in such general topics as intellectual property use, censorship and national security.

And, of course, given the nature of IGF, the multistakeholder model is seen as the best means to approach such problems.

Paul Brigner, the regional bureau director of the North American Bureau at the Internet Society, and Jacquelynn Ruff, vice president of international public policy and regulatory affairs for Verizon, offered insight into how new players are accepting and integrating into the multistakeholder approach.

Telecommunications firms, well aware of the dwindling demand for their traditional services in the wake of the Internet revolution, are “moving away from focusing on traditional telecommunications to Internet protocol and Internet issues,” Brigner said.

Jacquelynn Ruff speaks at the Opening Plenary Roundtable at IGF-USA in Washington, D.C. on July 26, 2012.

The possible transition to a sending-party-pays structure, for example, is an issue that demands the inclusion and participation of a multitude of affected parties. Under such a regime, “You’re not free, necessarily, to innovate at low cost like you experience today,” Brigner said. “It’s the end-to-end nature of the Internet that allows these sorts of things to evolve.”

To alleviate some of the difficulty inherent in such discussions, Ruff cited the importance of enhanced cooperation, the notion of mapping past developments and current deficiencies and projecting future ambitions in a way that involves all interested parties. Emphasizing examples within UNESCO, ICANN and the Council of Europe, Ruff celebrated enhanced cooperation’s increasing rate of adoption.

The world is at “a fork in the road on the global discussion on where the future lies,” she said. And applying enhanced cooperation to the traditional multi-stakeholder methodology could be an effective means to remedy the arguments over which path to take.

That said, a plethora of stakeholders have their own interpretations, and they will seize the opportunities granted by this IGF event and future conferences to throw their hats into the ring drawn by the opening plenary session’s panelists.

– Morgan Little

Internet Governance Forum – USA, 2011 Lee Rainie keynote: Understanding Users’ Views

leave a comment »

Brief description:

Lee Rainie, founding director of the Pew Research Center’s Internet & American Life Project, gave a morning keynote talk at IGF-USA 2011. Pew Internet is a non-profit, non-partisan “fact tank” that studies the social impact of the Internet. Since 1999 it has published more than 250 reports examining how people’s Internet uses influence their families, communities, health care, education, civic and political life and workplaces. All of this work is available for free at http://www.pewinternet.org.

Details of the session:

The grass, it appears, isn’t always greener on the other side of the fence. In the case of the American public’s perception of various Internet issues, there’s a notion that while individual digital access and experiences are progressing swimmingly, the conceptualization of the Internet as a whole is strikingly more negative.

This break in perception is dubbed “I’m OK, they’re not” by Lee Rainie, director of the Pew Internet & American Life Project. As the lead of the non-partisan, self-described “fact tank,” which publishes reports on Americans’ online attitudes and activities, Rainie sits at the heart of this contradiction.

JULY 18, 2011 – Lee Rainie from Pew Internet was the keynote speaker for the Internet Governance Forum-USA 2011.

In his keynote address at the Internet Governance Forum-USA, Rainie noted the persistence of “I’m OK, they’re not” throughout all aspects of society.

“It is a pretty common phenomenon in people’s evaluations of the world,” Rainie said. “They like their own congressman, but they don’t like Congress. They appreciate the school their children attend, but they think the education system is a mess. Our findings show they think their own use of the Internet is beneficial, but they are worried that others are not doing good things online and not getting good things out of their Internet use.”

A stagnant, thriving web

Accompanying these observations, Rainie cited statistics suggesting that the Internet has reached a user saturation point, at least domestically. The number of Internet users in the United States has remained stagnant since 2008, never moving outside the range of 75 to 79 percent. Broadband access has fluctuated only between 61 and 66 percent. The use of basic e-commerce has remained at about 70 percent of Internet users. And the blogging community has sat at around 14 percent since 2007.

That’s not to suggest that stagnation in the Internet’s domestic reach and in certain areas of the digital world speaks for the web in its entirety.

“At the same time, other metrics show growth in some online behaviors,” Rainie said, listing social networking, job searches, video use, online phone calls and online banking as areas with particularly strong growth trends.

In addition, mobile access has continued to grow, with a recent Pew poll indicating that 59 percent of Americans are connected either through a smartphone or a laptop, and 25 percent of smartphone users use their device as their primary access point to the Internet. Along with mobile growth comes the emergence of location-based services, with 6 percent of Americans online using check-in services, 9 percent allowing location awareness through social media and 32 percent of cell phone owners providing locational data in exchange for directions or recommendations.

Happiness is a warm modem

With the emergence of new digital entry points, public experiences with the Internet remain positive.

“In terms of their own use of the Internet, they don’t seem to have concerns about the way things are proceeding. To the degree that any ordinary users think about governance issues,” which Rainie admitted is probably not great, “they like what they have and they probably wouldn’t want it messed with.”

The individual, in his/her own immediate sphere, is pleased with the Internet. But there’s a paradox lurking alongside this placidity. Rainie’s keynote was, after all, titled, “I’m OK, They’re Not.” While studying American attitudes during the 2010 midterm elections, Pew found a series of inconsistencies.

A majority agreed that the Internet exposes people to a wide range of political views, but they find it difficult to sift through those views and to discern which are true and which are false. A majority find the Internet makes it easier for them to connect with those who hold similar political views, but they also believe it provides a larger, and perhaps disproportionate, stage for those with radical views.

From this public belief that individual Internet experiences are positive, but can take a turn for the worse when drawn out to the whole of society, Rainie outlined three couplets of American, and arguably global, desires.

They want liberty and security. They want transparency and confidentiality. And they want free expression to be allowed to flourish, with concern paid to tolerance and civility.

This bipolarity, of sorts, carries a clear implication for IGF and its hopes that multistakeholder discussion will bring about a universally beneficial future for the Internet.

“The appeal of the Internet to most users comes from the panoply of possibilities it brings to their lives,” Rainie said, and the role of IGF is to figure out how to reconcile the conflicting opinions of users and the multitude of stakeholders to ensure that appeal endures.

– Morgan Little

