Documentary coverage of IGF-USA by the Imagining the Internet Center


IGF-USA 2012 Afternoon Plenary: Remarks from Larry Strickling


Brief session description:

Thursday, July 26, 2012 – Larry Strickling, assistant secretary for communications and information and administrator of the National Telecommunications and Information Administration of the US Department of Commerce, spoke about the United States and the global Internet.

Details of the session:

Larry Strickling, assistant secretary for communications and information and administrator of the National Telecommunications and Information Administration (NTIA) of the US Department of Commerce, highlighted the importance of the multistakeholder model in his afternoon keynote talk at IGF-USA Thursday at Georgetown Law Center.

Larry Strickling speaks during closing plenary at the IGF-USA conference in Washington, D.C. on July 26, 2012.

The NTIA has long been integral to the operation of the Internet Corporation for Assigned Names and Numbers (ICANN), which coordinates global domain name policy. Although NTIA, on behalf of the US Department of Commerce, reached an agreement with ICANN in 2009 to transition the technical coordination of the DNS to a new setting within ICANN under conditions that protect the interests of global Internet users, NTIA still represents the US government on ICANN’s Governmental Advisory Committee and remains an influential force.

Given the near-infinite reach of Internet services, Strickling emphasized the need to include more global representatives in the process of domain name regulation and the discussion of related issues.

“We have focused on enhanced cooperation and finding ways for the global Internet community to have a more direct say in matters of Internet governance,” he said. “This issue is one of great importance as we head into the World Conference on International Telecommunications (WCIT) and World Trade Forum conferences over the next year, where some countries will attempt to justify greater governmental control over the Internet.”

Strickling said the NTIA has made a concerted effort to dissolve the illusion of US control over the Internet infrastructure by showing a heightened respect for the laws of individual countries and finding new ways to address conflicts of interest.

He also expressed his support of greater transparency in all organizations involved in Internet governance, including the International Telecommunication Union, and forcefully restated that the US position on Internet governance is to appropriately limit the role of the government in policymaking.

“Those of us in the US government will work to be as inclusive and transparent as we can be,” he said. “We will push back against calls for more control. Limiting ourselves to the role of facilitator is absolutely key to the ultimate success of the (multistakeholder model). We will press ahead.”

– Katie Blunt

 

IGF-USA 2012 Afternoon Plenary: Summaries of the Day’s Sessions


Brief session description:

Thursday, July 26, 2012 – The moderators and organizers of the day’s workshops, best practices and case studies sessions presented reports rounding up the key details of the day’s discussions.

Details of the session:

From big data to youth online, policy must evolve with the Internet. Nine workshops were held at today’s IGF-USA concerning topics from gTLDs and big data to cybersecurity and youth online. Moderators or organizers reported the main takeaways from each workshop or best practice forum.

Workshop: Next challenge – How to handle data in the cloud

  • The idea of big data confuses policymakers, who tend to interpret the data as numbers as opposed to actual text, image and video content. The term “big data” is often confused with politically charged rhetoric like “big government,” “big business” and “big tobacco,” joked moderator Mike Nelson, a professor at Georgetown University and research associate at CSC Leading Edge Forum.
  • The panel concurred that the European Union idea of a digital “right to be forgotten” is impractical for the majority of Internet users. Because of the interconnected nature of the Internet, erasing one user’s published content could cause a domino effect and compromise the content of other users.
  • The policies instituted thus far have created layers upon layers of future problems for policymakers, according to Nelson.

Ron Andruff speaks to conference participants during “Summary Reports from Workshops” session at the IGF-USA conference in Washington, D.C. on July 26, 2012.

Workshop: The changing landscape of the Domain Name System – New generic top-level domains (gTLDs) and their implications for users

  • A multi-stakeholder model for the Internet will continue to exist but needs to be refined, moderator Ron Andruff, president and CEO of DotSport LLC, said of the panel’s conclusions.
  • A healthy criticism of ICANN can only make it a better organization, Andruff said. Domain names may continue to seem irrelevant based on the continued use of external links and search engines, but they will always play a critical role on the Web.
  • There are “more questions than answers about the changing landscape,” Andruff said. “While we know the changes are coming, none of us know what those are.”

A scenario story: Two possible futures for copyright: anarchy or totalitarianism

  • This workshop focused on two possible extremes regarding copyright laws: total removal of all copyright law or complete enforcement of copyright online. A panel of three students discussed the economic and creative implications of each, led by organizer Pablo Molina, information systems technology professor at Georgetown University Law Center.
  • If a regime of anarchy concerning copyright laws ensued, creativity would flourish at an individual level, but companies would be hindered from investing in creativity at an organized level, Molina said. If a regime of totalitarianism concerning copyright laws developed, money would constantly be changing hands at the business level, but individual creativity would suffer for fear of prosecution for the use of copyrighted materials.
  • In a straw poll of the audience, one-third believed the U.S. would most likely adopt an anarchic copyright policy, while two-thirds believed the U.S. would favor copyright totalitarianism.

    Pablo Molina, Georgetown Law Center

  • The panel concluded laws are only one part of the copyright struggle, and other important aspects include social norms and efforts for more innovative business model solutions, Molina said.
Best practice forum: ICTs for disaster response – How the Internet is transforming emergency management

  • “One of the principal takeaways was that communication and information flows have evolved to the point of being an essential service in international disaster response,” said panelist Garland McCoy, founder and president of the Technology Education Institute.
  • Considering that citizens are often the first at the scene of an emergency, social networking sites are transforming the way the public becomes aware of emergency situations, McCoy said.

Workshop: Can an open Internet survive – Challenges and issues

  • Main challenges in maintaining an open Internet include protecting intellectual property, privacy and freedom of access simultaneously, according to moderator Robert Guerra, Privaterra founder.
  • Important measures for maintaining an open Internet include embracing the multistakeholder effort and considering different global points of view, Guerra said.

Critical Internet Resources (CIRs): Evolution of the Internet’s technical foundations

  • In order to keep the Internet alive, it is necessary to continue to develop policies and adopt standards for the future of CIRs, said panel co-organizer Paul Brigner, the regional bureau director of the North American Bureau at the Internet Society.
  • The perception that the US government controls Internet governance has diminished considerably as multiple stakeholders continue to participate actively in the policymaking process.

Workshop: Cybersecurity – Channeling the momentum at home and abroad

  • Panelists agreed that, unlike debates concerning healthcare and economic policy, the debate over Internet policy has not yet produced a common understanding among the public, according to Patrick Jones, senior director of security for ICANN.
  • Policymakers have a long way to go before they can fully understand the technical implications of the Internet governance legislation now being developed, Jones said.

Turning principles into practice, or not: Case vignettes – Internet Governance/ICANN, consumer privacy, cyber security, dialogues about lessons learned

  • Principles are useful for guiding hard legislation, but when Internet actors do not follow legislative guidelines, principles alone are not enough to govern the Web, said moderator Shane Tews of Verisign.
  • Soft laws have the flexibility to update and change as the Internet evolves, Tews said.

Youth forum: Youth in an online world – Views and perspectives of youth as users

  • The majority of college students 20 and older are most concerned about privacy online, specifically on social networking sites, said moderator Dmitry Epstein, a post-doctoral fellow at Cornell University Law School.
  • Youths’ main worry about the future of Internet policy is that they will have to keep managing privacy settings if they want to enjoy free and personalized services online, Epstein said.
  • One solution Epstein proposed was a policy to make all social networking sites private by default and allow for simple ways to configure privacy, such as a switch or opt-out of personal information disclosure.

– Madison Margeson


IGF-USA 2012: Critical Internet Resources (CIRs) – Evolution of the Internet’s Technical Foundations


Brief session description:

Thursday, July 26, 2012 - Since the initiation of the Internet Governance Forum (IGF), Critical Internet Resources (CIR) and the evolution of the Internet’s technical foundations have been a central focus of ongoing Internet governance debates. Varied views can engender misunderstandings that influence the opinions of global stakeholders, and different views exist about how to advance CIRs. International governmental approaches are proposed by some, while others strongly support the present bottom-up, consensus-driven models. Three foundational technological changes – IPv6, secure Domain Name System (DNSsec) and secure routing – framed the discussion in this workshop. Deployment of these new technical and organizational approaches raises significant challenges to stakeholders, operations and governance arrangements.

Details of the session:

The moderator for the session was Walda Roseman, chief operating officer of the Internet Society. Panelists included:

  • Steve Crocker, chair of the board of the Internet Corporation for Assigned Names and Numbers
  • John Curran, president and CEO of the American Registry for Internet Numbers
  • Richard Jimmerson, director for deployment and operationalization, Internet Society
  • Vernita Harris, deputy associate administrator in the Office of International Affairs of NTIA, US Department of Commerce

Thursday’s IGF-USA conference at Georgetown Law Center featured an assembled panel of government and corporate experts who addressed the controversial issues concerning the control of critical Internet resources.

Walda Roseman, chief operating officer of the Internet Society (ISOC), chaired the discussion on the implementation and security of CIRs.

CIRs include IP addresses, domain names, routing tables and telecommunications, or what Steve Crocker, CEO and co-founder of Shinkuro Inc., Internet Hall of Fame member and chair of the board of ICANN, called the base of Internet architecture upon which everything else is built.

Moving from Internet Protocol Version 4 to IPv6

One of the most pressing concerns regarding CIRs is the transition of the Internet Protocol (commonly referred to as IP) from version 4, still the dominant protocol for Internet traffic, to version 6.

IPv4 used 32-bit addresses, allowing for approximately 4.2 billion unique IP addresses, but the growth of the Internet has exceeded those limits. IPv6 uses 128-bit addresses, allowing for about 3.4×10^38 unique addresses. This number is equal to approximately 4.8×10^28 addresses for each of the seven billion people alive in 2012.
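The arithmetic behind those figures is easy to check. Below is a quick back-of-the-envelope calculation in Python, assuming a 2012 world population of roughly seven billion:

    # Back-of-the-envelope check of the IPv4/IPv6 address figures quoted above.
    ipv4_total = 2 ** 32                       # about 4.29 billion addresses
    ipv6_total = 2 ** 128                      # about 3.4 x 10^38 addresses
    per_person = ipv6_total / 7_000_000_000    # roughly 4.8 x 10^28 per person

    print(f"IPv4 addresses: {ipv4_total:,}")
    print(f"IPv6 addresses: {ipv6_total:.1e}")
    print(f"IPv6 addresses per person: {per_person:.1e}")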

John Curran speaks about critical internet resources during IGF-USA conference in Washington, D.C. on July 26, 2012.

Because headers on IPv4 packets and IPv6 packets are quite different, the two protocols are not interoperable, and thus they are both being run in what is called a “dual stack.”

However, IPv6 is, in general, seen to be a conservative extension of IPv4. Most transport and application-layer protocols need little or no change to operate over IPv6. The exceptions to this are the application protocols that embed internet-layer addresses, such as FTP and NTPv3. In these, the new address format may cause conflicts with existing protocol syntax.
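As a small illustration of what running a dual stack looks like from an application’s perspective, the Python sketch below (the host name is just a placeholder) performs a standard name lookup and prints both the IPv4 and IPv6 addresses a dual-stack host publishes:

    import socket

    # Resolve a host over both address families, as a dual-stack client would.
    # getaddrinfo returns IPv4 (AF_INET) entries for A records and IPv6
    # (AF_INET6) entries for AAAA records when the host publishes both.
    for family, _, _, _, sockaddr in socket.getaddrinfo("www.example.com", 80):
        label = "IPv6" if family == socket.AF_INET6 else "IPv4"
        print(label, sockaddr[0])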

Internet service providers, the Internet Society and many large Internet-based enterprises worked to support a World IPv6 Launch on June 6 this year to help accelerate the adoption of IPv6.

John Curran, president and CEO of the American Registry for Internet Numbers, said upgrading to IPv6 is a necessary step for “any enterprise that wants to still be in business in five years,” because it enables them to continue to reach new customers and grow.

When asked about the costs or burdens of upgrading to IPv6 for small businesses, Curran explained that in most cases the burden would fall on the hosting company through which they run their website.

Chris Griffiths, director of high-speed Internet and new business engineering for Comcast, confirmed this, stating his company would have to upgrade to continue to attract new clients.

Security issues always loom large in Internet evolution

The development of the Internet has led to a need for Domain Name System Security, or DNSSEC. Curran explained that DNSSEC maintains the integrity of the Internet by ensuring the information users obtain is from the source they believe they are corresponding with, essentially preventing redirection to fraudulent websites.
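For a rough sense of what DNSSEC validation looks like from the client side, here is a minimal sketch, assuming the third-party dnspython library and a validating recursive resolver such as Google’s 8.8.8.8 (the queried domain is only an example): the client requests DNSSEC data and checks whether the resolver marked the answer as authenticated.

    import dns.flags
    import dns.resolver  # third-party dnspython package

    resolver = dns.resolver.Resolver()
    resolver.nameservers = ["8.8.8.8"]          # a DNSSEC-validating resolver
    resolver.use_edns(0, dns.flags.DO, 1232)    # set the DO bit to request DNSSEC records
    answer = resolver.resolve("isoc.org", "A")  # example domain
    validated = bool(answer.response.flags & dns.flags.AD)
    print("Answer passed DNSSEC validation:", validated)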

Redirection could come from hackers, hijackers and phishers, but also the US government, should initiatives such as SOPA or PIPA pass.

“My primary interest is keeping the open Internet alive,” said Richard Jimmerson, director of deployment and operationalization for ISOC. “Somebody in this room will want to invent the next Facebook or Yahoo! Today, that is possible, but if we do not pay attention to certain things, that may not be possible anymore.”

Griffiths said Comcast and other Internet technology companies work together through governance processes now in place to address security vulnerabilities, adjust infrastructure and deal with other emerging challenges in order to avoid future risk.

Steve Crocker speaks about critical internet resources during IGF-USA conference in Washington, D.C. on July 26, 2012.

Conflicts arise over the management of CIRs

The US government currently maintains the most control globally over CIRs. This is not well received by some critics around the world, who fear that the United States may abuse its power. Some have also said they would like to see a roadmap for the Internet covering the next 20 years.

Curran addressed these concerns by stating that the US government has a positive track record regarding the respectful and neutral administration of its responsibility for CIRs, mostly leaving the operational details to multistakeholder global governance bodies such as the Internet Engineering Task Force and ICANN. He added that such a roadmap would not likely be effective because there are too many unknowns moving forward.

Vernita Harris, deputy associate administrator of the National Telecommunications and Information Administration, explained that the newest Internet Assigned Numbers Authority (IANA) contract indicates it expects that ICANN and aspects of control over the Internet architecture “will be multi-stakeholder driven, addressing the concerns of all users both domestic and international.”

– Brennan McGovern

IGF-USA 2012 Workshop: Next Challenge – How to Handle Big Data in the Cloud


Brief session description:

Thursday, July 26, 2012 - The dramatic reduction in the cost of computing and storage made possible by cloud computing services, the spread of easy-to-use, open-source analytic tools, and the growing availability of massive data services from governments and the private sector (e.g. Google Maps) have enabled thousands of start-ups, hackers and others to create exciting new tools for business, entertainment, government and other sectors. Government policies can help or hinder development of new databases and Big Data apps. Issues covered in this session included: 1) open government data policy; 2) Intellectual Property Rights protection; 3) IT research; 4) technology test beds; 5) education; 6) law enforcement access; and 7) privacy regulations.

Details of the session:

The moderator for the session was Mike Nelson, a professor at Georgetown University and research associate at CSC Leading Edge Forum. Panelists included:

  • Jeff Brueggeman, vice president of public policy for AT&T
  • Paul Mitchell, senior director and general manager, Microsoft TV Division
  • Lillie Coney, associate director, Electronic Privacy Information Center
  • Jules Polonetsky, director and co-chair, Future of Privacy Forum
  • John Morris, director of Internet policy, NTIA/US Department of Commerce
  • Katherine Race Brin, attorney, bureau of consumer protection, Federal Trade Commission

Mike Nelson, an Internet policy expert from Georgetown University, shared some reassuring remarks as he introduced a panel session that concentrated upon the complexities of managing what has become known as “Big Data.”

“These issues are not new,” he said. “We’ve been dealing with them for 20 or 30 years, but they are a lot more important now.”

Nelson explained in a workshop at IGF-USA Thursday at Georgetown Law Center that it’s not just about data that are big. It’s about data that are changing so quickly that they require innovative management tools.

He introduced the following questions:

Polonetsky and Brueggeman exchange stories at a workshop about the cloud at IGF-USA in Washington, D.C. on July 26, 2012.

  • How will privacy concerns impact the development of large databases (or will they have any significant impact)?
  • What are the liability issues of Big Data in the cloud?
  • How do we deal with the shortage of data experts?
  • How do we handle issues concerning control and access to data?

Jeff Brueggeman, a global public policy executive with AT&T and a longtime participant in Internet governance discussion in many fora, began the conversation by addressing a few of these issues.

First, he noted the importance of working with businesses as well as policymakers to come up with tools to manage the data. He also addressed the significance of maintaining security in cloud data.

“The more data that’s being collected and retained, the more that data could be used as a target,” Brueggeman said.

Brueggeman introduced some issues for the other panelists to address, asking about best practices for dealing with data sets, the types of control users should expect, which uses of data are legitimate without such control, and international considerations.

Jules Polonetsky of the Future of Privacy Forum followed with a look at the long-term perspective, offering some insight about the impacts of cloud technology.

“I’ve always had a hard time getting my head around clouds,” Polonetsky said. “But the best we can do is make sure we’re a bit of a gatekeeper.”

He argued that a formalized procedure should be established for the release of private information to law enforcement officials and others seeking information. He also elaborated on the risks of such technology, telling a story about a friend, a rabbi, who watched a racy video unaware that Facebook would automatically share the link on his Facebook page. The anecdote shows how easy it is to inadvertently share online activity with the greater digital community.

Polonetsky champions the benefits of data use, but he also urges people to consider the implications of such sharing and storing of data. He said he believes there should be a debate in which people weigh the risks and benefits.

Katherine Race Brin speaks about the cloud at IGF-USA in Washington, D.C. on July 26, 2012.

Katherine Race Brin continued the conversation, citing some of her experiences dealing with these issues in her job with the Federal Trade Commission.

She said the FTC has studied the implications of cloud computing for a number of years and has also considered how, or whether, the cloud differs from other forms of data transfer in regard to privacy.

She said her work at the FTC has led her to believe that the companies that are storing data in the cloud are often in the best position to assess the risks of that data sharing.

“We’ve always said, in relation to personal data, the businesses remain accountable for the personal data of their customers,” Brin said.

She said that while the FTC holds businesses responsible, it also provides a framework to ensure consumer privacy.

Brin explained the three key aspects of this framework:

  • Privacy by design – Companies should build in privacy protection at every stage, from product development through product implementation. This includes reasonable security for consumer data, limited collection and retention of such data and reasonable procedures to promote data accuracy.
  • Simplified consumer choice – Companies should give consumers the option to decide what information is shared about them and with whom. This should include a “do-not-track” mechanism that would provide a simple, easy way for consumers to control the tracking of their online activities (a minimal illustration of such a signal follows this list).
  • Transparency – Companies should disclose details about their collection and use of consumers’ information and provide consumers with access to the data collected about them.
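To make the “do-not-track” idea concrete, here is a minimal sketch, assuming the third-party requests library: an HTTP client signals the Do Not Track preference by sending the DNT header, the kind of simple opt-out signal the framework describes. Whether a site honors the signal is entirely up to the site, and the URL is only a placeholder.

    import requests  # third-party HTTP client library

    # Send a request with the Do Not Track preference expressed via the DNT header.
    response = requests.get("https://www.example.com/", headers={"DNT": "1"})
    print(response.status_code)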

Lillie Coney, associate director of the Electronic Privacy Information Center, offered her insights as an expert on big data.

“Governance is not easy,” Coney said. “But we do learn mechanisms for creating accountability, transparency and oversight.”

She noted that the difficulty lies in creating guidelines that have currency and legitimacy. In regard to cloud computing, Coney suggests that people are not only consumers; they themselves – or at least the sets of private information they share – are actually products.

“Our online activity alone generates revenue, and many consumers don’t understand that,” Coney said.

She said she strongly believes in the importance of the public’s engagement in the conversation. With all these privacy concerns, Coney said the consumer cannot afford to leave it up to businesses or government.

Lillie Coney speaks about the cloud at IGF-USA in Washington, D.C. on July 26, 2012.

Microsoft executive Paul Mitchell added some perspective on how to go about tackling the management of Big Data. “I think the big challenge here is figuring out what’s first when we’re talking about big data,” Mitchell said, noting the overwhelming amount of data being created and stored in databases. “What we have here is not a new problem. What we have here is a problem of scale.”

Mitchell said we can look at the separate desires of people, businesses and society, and consider a philosophy based on each group’s needs. He explained that the people-first philosophy would ensure that data that could be harmful isn’t allowed to be. The business-first philosophy would be about maximizing the potential economic return for the use of data. The society-first philosophy would optimize the value for society as a whole based on what can be done with the data.

“From an operating perspective, the challenges we face involve how to govern against these three axes,” said Mitchell. “The policymakers’ choice is how to balance the three appropriately.”

Nelson then asked the panelists about the future of the cloud and whether there will be one cloud in an interconnected world or a world of separate clouds run by different companies.

Mitchell argued that there are circumstances that will require a private set of services.

Coney expressed concern over that model. “Consumers’ control over their data in a cloud-driven environment will require the ability to move their data from Cloud A to Cloud B. Making that a reality in this environment is going to be the challenge,” she said.

Polonetsky had a slightly different viewpoint. He considered the business-platforms perspective, questioning how easy it should be to move consumers’ data.

“Yes, it is your data, but did the platform add some value to it by organizing it in a certain way?” he asked, adding that the platforms may make a legitimate contribution by organizing consumers’ data. For example, your Facebook friends belong to you, but Facebook created a platform in which you interact with and share information, photographs and other things with them.

To conclude, Nelson took a few questions from the audience and asked each panelist for a recommendation regarding the future management of Big Data. Brueggeman suggested there should be a set of commonly accepted practices for managing Big Data. Polonetsky added there should be more navigable digital data. Brin supported the strong need for transparency. Coney proposed that cloud providers and Big Data companies must show their respect for a diverse group of stakeholders. Mitchell recommended that we should all work toward greater harmony between business, personal and societal values.

– Audrey Horwitz

IGF-USA 2012 Scenario Story: Two Possible Futures for Copyright – Anarchy or Totalitarianism


Brief session description:

Thursday, July 26, 2012 - The laws of copyright were introduced before the Internet, before file-sharing and before the advances in digital tools now used to create sampling, mash-ups and remixes. One example of the complex copyright conflicts faced today is “The Grey Album,” produced by DJ Danger Mouse. It gained notoriety as it challenged the intellectual property structure in place, mashing two legally protected albums in a violation of copyright law. Danger Mouse created the album strictly as a limited-edition promotional item (only 3,000 copies), but it immediately went viral and caught the ear of many people in the music industry and all over the US, making any legal cease-and-desist request technically meaningless. This example illuminates the incredibly complex and nuanced existence of copyright law in America today. This scenario exercise was aimed at exploring two divergent sides of America’s copyright future, one where regulations surrounding copyright law are lax to the point of anarchy, and the other where the regulations increase at an exponential rate, creating a totalitarian atmosphere.

Details of the session:

Moderators for the session were Ariel Leath and Kalyah Ford, graduate students at Georgetown University. Panelists included:

  • Thomas Sydnor II, senior fellow for intellectual property at the Association for Competitive Technology
  • Matthew Schruers, vice president for law and policy at the Computer & Communications Industry Association
  • Brandon Butler, director of public policy initiatives for the Association of Research Libraries

Thomas Sydnor II speaks in a workshop about the future of copyright at IGF-USA in Washington, D.C. on July 26, 2012.

This scenario exercise at IGF-USA 2012 featured a consideration of what might happen if one or the other of two extreme situations – totalitarianism or anarchy – evolved in the future. Students from Georgetown University proposed two possible futures for panelists to discuss.

Scenarios: In an anarchist 2020 scenario, panelists discussed what might happen if a high school student turned in work incorporating aspects of Ernest Hemingway’s “The Sun Also Rises.” Would a teacher be expected to treat it as an original work? In a totalitarian 2020 scenario, panelists discussed a situation in which the phrase “good morning” is owned by McDonald’s, and any online use of it would instantly set off an alarm automatically requiring that the violator of the copyright pay a $500 fine.

These two scenarios tied to copyright, according to panelists at IGF-USA Thursday at Georgetown Law Center, are highly unlikely, but they are still interesting to ponder. They discussed the potential ramifications of both extremes.

“As far as totalitarianism, if (the United States) were to fall into totalitarianism, we’d have done it a long time ago,” said Thomas Sydnor II, a research fellow at the Association for Competitive Technology. “When I take my walk with my dogs, my dogs trespass on my neighbors’ lawns, and I go and I trespass on my neighbors’ lawns to clean up what they left on my neighbors’ lawns. And yet, I do this every day and there is not the slightest chance that I will ever be sued for it, much less arrested because we all realize that, to a certain extent, part of rights is exercising a little restraint in enforcement.”

Sydnor also stressed the importance of thinking about where the Internet and its users are going in the long run in terms of copyright law enforcement. “We don’t need to have perfect enforcement, but we do need better than we have now,” he said.

Thomas and Matthew share laughs during a workshop about the future of copyright at IGF-USA in Washington, D.C. on July 26, 2012.

“Whether we like it or not, it’s a much more complex copyright environment today,” said Pablo Molina, information systems technology professor at Georgetown University Law Center.

“I consider it as similar to considering the tax laws. Tax laws are so complicated, there are so many tax laws passed every term, that you need really expert tax lawyers and CPAs and other people just to figure out how to file corporate or individual taxes when things get complicated, and I would argue that it is the same thing with copyright law.” He said we are likely to be moving toward more and more legislation and more and more enforcement.

Panelist Matthew Schruers of the Computer & Communications Industry Association argued that while law regulates the Internet, the impact of other vital factors must figure into decisions about the Internet as well, including markets, architecture and social norms.

Schruers predicted that even if copyright law goes in the direction of anarchy, human norms will most likely still prevent people from entirely disregarding the idea of copyright law.

He said that if he were asked to predict which direction Internet regulation of intellectual property is most likely to go in the future, he would say it will become more anarchic than it is today.

“In a low-protectionist anarchy environment, you’re likely to see more noncommercial and derivative work that is based largely on noncommercial creation,” Schruers said. “Control needs to be effective in order to produce [a totalitarian environment].”

Given the same choice regarding which direction on the totalitarian-anarchist spectrum society is most likely to go in the future, Molina said he believes society is moving in the direction of totalitarianism. Even so, he said he believes a full tilt to this extreme is unlikely. “There are always ways for people to circumvent the system,” Molina explained. “Both scenarios are possible. Whether they are likely is a different story.”

In terms of other factors important to the copyright law discussion, Molina and Schruers both said economic growth is an extremely good measure to assess when seeking balance between the extremes.

“In terms of progress, economic progress is the best metric we have,” Schruers said.

– Mary Kate Brogan

IGF-USA 2012 Best Practice Forum: ICTs for Disaster Response – How the Internet is Transforming Emergency Management


Brief session description:

Thursday, July 26, 2012 - Recent man-made and natural disasters around the globe have highlighted the importance of ICTs for connecting public safety officials, coordinating response operations and keeping citizens informed. Additionally, new and emerging Internet-based tools, mobile applications and social media have transformed disaster-relief efforts, providing real-time data for first responders and empowering citizens to access and share life-saving information and locate loved ones. Enhanced situational awareness via multiple platforms offers almost instantaneous and ubiquitous information regarding implications for life and property and individuals impacted by natural or man-made risks and threats. Internet-based communication is increasingly relied upon to support disaster preparation, response and recovery. Workshop participants looked at what must be done to ensure resilient infrastructures and continuity of operations, including keeping citizens informed. Panelists were invited to share their perspectives and the lessons learned from recent disasters and to work to identify recommendations for collaboration among stakeholders in preparing for future disasters.

Details of the session:

The moderator was Joe Burton, counselor for technology and security policy in the Communications and Information Policy Office of the US State Department. Panelists were:

  • Garland T. McCoy, founder and president of the Technology Education Institute
  • Kristin Peterson, CEO and co-founder of Inveneo, an organization that provides ICTs to remote areas
  • Keith Robertory, disaster response emergency communications manager for the American Red Cross
  • Veronique Pluviose-Fenton, a congressional staffer who focuses on homeland security and disaster response
  • Tom Sullivan, chief of staff of the Federal Communications Commission

Véronique Pluviose-Fenton speaks at a workshop on ICTs for Disaster Response at IGF-USA in Washington, D.C. on July 26, 2012.

Last month, severe storms in the Northern Virginia/Washington, D.C., metro area not only knocked out Internet service, but also caused an outage of 911 Emergency Response telephone services that lasted four days.

The Best Practice Forum at IGF-USA Thursday at Georgetown Law Center featured a discussion between government and NGO representatives on how to address this type of scenario and best coordinate disaster response in the current technological era.

According to Garland McCoy, founder of the Technology Education Institute, the 911 outage highlights the flaws of the current IP-backed telephone system, which evolved from the analog, hard-wired telephone system.

“Back in the twisted copper-wire days, the power could go out but your phone would stay on,” McCoy said. But the IP phone system now has ”hub and spoke” architecture with a single point of failure, known as a Big Data facility.

Véronique Pluviose-Fenton, a congressional staffer who focuses on homeland security and disaster response, spoke on the failures of the communication system following major catastrophes such as Hurricane Katrina and the terrorist attacks of Sept. 11, 2001.

Pluviose-Fenton emphasized the importance of interoperability—the ability of networked communications systems to communicate with each other.

“We all watched live what happens when they (first responders) couldn’t communicate,” she said, referencing the chaos of the 2001 attacks on the United States, when police officers and fire fighters could not talk or relay warnings.

Keith Robertory, disaster services technology manager for the American Red Cross, said it’s possible to build an entirely interoperable network, but there are quite a few political roadblocks standing in the way. “Can you imagine if the New York police chief and fire chief are trying to agree who owns a shared network and who controls it?” Robertory asked, illustrating the difficulty of interconnectivity.

Pluviose-Fenton agreed, saying, “I still fundamentally feel that even with the advances in technology, there still is a problem with will.”

This is not just a domestic issue, as disasters in foreign countries have also put communication technology to the test. US agencies and NGOs often join the global-assistance efforts when disaster strikes elsewhere.

Kristin Peterson, CEO of Inveneo (a non-profit focused on ICTs in the developing world), discussed her role in establishing a wireless network in Haiti following the 2010 earthquake that destroyed nearly all existing communication systems in the island nation. Every aid group providing relief had its own network, from the American Red Cross to the US military.

“Within 24 hours we knew we had to put up a WiFi network,” Peterson said.

The task took several days but was a necessary step in orchestrating the global response in aiding Haitian refugees, from providing food and water to distributing shoes sent by singer Jessica Simpson.

“If you can’t communicate, you can’t coordinate your response,” Peterson said.

Tom Sullivan speaks at a workshop on ICTs for Disaster Response at IGF-USA in Washington, D.C. on July 26, 2012.


Tom Sullivan, chief of staff of the US Federal Communications Commission, said that even Japan, a country with an extremely sophisticated communications system and other cutting-edge technology, had to depend on a backup power grid following the 2011 earthquake.

He said it is necessary for the United States to develop a strong contingency communications plan in order to be prepared for the inevitable arrival of yet another Katrina-esque catastrophe or any devastating emergency situation. Robertory elaborated on this need. He supervises American Red Cross efforts to establish emergency communications infrastructures when providing relief to victims of disasters.

He and Sullivan also emphasized the importance of citizen engagement in a field where first-response is not and never will be 100-percent reliable.

“If 911 services were bad, wouldn’t you be more likely to learn first aid and CPR?” Robertory asked. He explained that citizens should form their own personal contingency plans should communication fail in the aftermath of a disaster.

All of the panelists agreed that advances in technology provide both new opportunities and new challenges for those responsible for disaster relief.

– Brennan McGovern

IGF-USA 2012 Workshop: Can an Open Internet Survive – Challenges and Issues


Brief session description:

Thursday, July 26, 2012 – This workshop focused on the challenges of keeping the Internet open while simultaneously maintaining a safe and secure environment for individuals, businesses and governments. Governments encounter a wide-ranging set of issues and concerns that can limit an open Internet, including the cost of connectivity, spam/malware, intellectual property rights, human rights and objectionable content. Businesses often make decisions for business purposes that may contribute to closing off the Internet. Leaders in governments’ legislative branches, including the US Congress and its counterparts around the world, and business leaders do not always recognize the implications of the actions they take that might negatively influence the Internet. In addition, citizens may voluntarily but without full understanding accept moves that contribute to closing off the Internet, quietly accepting actions and decisions that affect its openness in a negative way. The session worked to identify the key characteristics of an open Internet; the global and national challenges that threaten this openness; the initiatives pursued to advance the open Internet; and multistakeholder engagement to develop and promote an open Internet.

Details of the session:

The session was moderated by Robert Guerra, principal at Privaterra and senior advisor to Citizen Lab in the school of global affairs at the University of Toronto. Panelists were:

  • Ellen Blackler, vice president for global public policy, The Walt Disney Company
  • Thomas Gideon, technical director of the Open Technology Institute at the New America Foundation
  • Andrew McDiarmid, policy analyst at the Center for Democracy and Technology
  • Julian Sanchez, research fellow at the Cato Institute
  • Paul Diaz, director of policy for the Public Interest Registry
  • John Morris, director of Internet policy, office of policy analysis and development of the US National Telecommunications and Information Administration

Ellen Blackler participates as a panelist about challenges and issues facing an open Internet at IGF-USA in Washington, D.C. on July 26, 2012.

Between copyright infringement, intellectual property, piracy and protection of online privacy, the openness of the Internet is being threatened on all sides, according to six IGF-USA panelists, who gathered to define and assess the challenges to an open Internet Thursday at Georgetown Law Center.

“The free and open Internet oughtn’t be a free-for-all,” said Ellen Blackler, vice president for global public policy for The Walt Disney Company.

Striking a balance between maintaining an open Internet and ensuring security and privacy while minimizing piracy has always loomed as one of the largest challenges to the future of the Internet. While members of this panel represented diverse Internet backgrounds, they all agreed that Internet policy must and will continue to evolve with the challenges posed by the struggle between these often-competing values.

What is Internet openness?

The definition of an open Internet differs even among seasoned IGF attendees.

John Morris of the National Telecommunications and Information Administration (NTIA) cited the principles of Internet openness recommended by the Organisation for Economic Co-operation and Development (OECD) last year, which highlight several key characteristics, including the opportunities for both collaboration and independent work.

An open Internet allows users to operate “independently of one another, so as not to have a centralized single body to control or impose regulations,” Morris said.

The Internet policymaking process additionally needs to be open for collaboration, Morris said.

“What is it that keeps barriers low, what steps can we take to address challenges?” asked Andrew McDiarmid, policy analyst for the Center for Democracy and Technology (CDT). “It’s about learning … to keep the process more open and open to more voices.”

Though the openness of the Internet is one of the Web’s key characteristics, challenges ensue when openness trumps privacy.

“The openness principle has failed the public in privacy interest,” Blackler said.

U.S. policies directly affect those abroad

In the United States, Internet access is virtually everywhere, but the major challenge for Internet openness in many other parts of the world is online accessibility, especially in remote areas and in developing nations.

Robert Guerra acts as moderator during a workshop about challenges and issues facing an open Internet at IGF-USA in Washington, D.C. on July 26, 2012.

“Access at an affordable cost is key because then we can innovate,” said panel moderator Robert Guerra, the founder of Privaterra.

Panelists agreed that though global policies across the board on the issues tied to Internet openness are unlikely to be established due to differing cultural values and standards from country to country, cooperation on the international scale is still quite important.

“Not that I think we need to achieve one global norm about a particular issue, but we need to achieve a global level of interoperability,” Morris said.

In some countries, global Internet operability is a major issue due to government blocking and filtering–the management of what content citizens may or may not access or share. Thomas Gideon of the Open Technology Institute noted the difficulties that global policymakers face with nations that exercise a great deal of control over available content.

“A large part of what I do in my work is to defend human rights online,” Gideon said. “That’s equally fraught with the risks that those trying to speak freely in contentious and crisis regimes face.”

Paul Diaz, director of policy for the Public Interest Registry, noted the challenge of making governance measures work both locally and globally. “What works in one environment, what may work here in the US, is not necessarily applicable in another country,” he said. “Ultimately, the Internet is global, and therein lies the challenge.”

Piracy and copyright: What is the solution?

When discussing the widespread nature of piracy online and the difficulty in regulating it, panelists differed in their preferred approach to dealing with the challenges of intellectual and copyrighted property.

“Companies like Netflix are slowly finding ways to shift from a product to a service model,” Julian Sanchez, a research fellow at the Cato Institute, said, suggesting this as one successful choice for property owners.

Sanchez argued that the best way to discourage piracy is to create services that offer consumers a wide variety of choices and control over consumption of goods at a fair price. He said this is a better method than exclusively offering products that can be copied and shared and pirated just as easily.

Private niches online: Social networking and the cloud

With the advent of social networking and the desire to share and access personal information, the Internet includes private and targeted content, as well.

Sanchez emphasized that the structure of the Internet should be seen more as a network of people and relationships than as a technological architecture.

Facebook’s Sarah Wynn-Williams said social networking represents the “desire for people to connect and share and be open,” adding that the future of Internet policy must meet these demands and “preserve the ability of people to [share personal content online,] which is genuinely under threat.”

Panelists also noted that files shared through cloud data storage continue to be as difficult to regulate as physically shared materials. Just as governments have largely chosen not to investigate copied CDs or cassettes distributed among friends, content in the cloud is difficult to trace and regulate.

– Madison Margeson
