IGF-USA 2012 Afternoon Plenary Discussion: Defining the Future for Internet Governance – Meeting Evolving Challenges
Brief session description:
Thursday, July 26, 2012 – This major closing plenary session of IGF-USA discussed the current state of play on various proposals, ranging from the WCIT to the UN Commission on Science and Technology and Enhanced Cooperation, and areas where, from some perspectives, more government involvement or strong improvements in “governance” may be called for. Panelists offered a range of perspectives on government and governance.
Featured participants in this special session included Jeff Brueggeman, vice president for public policy, AT&T; Chris Wolf, partner and Internet law expert from Hogan Lovells; and Danny Weitzner, Office of Science and Technology Policy, The White House.
Details of the session:
As Chris Wolf of Hogan Lovells said, the ghosts of Internet past, present and future were part of the final plenary discussion on “Defining the Future for Internet Governance: Meeting Evolving Challenges” at IGF-USA Thursday at Georgetown Law Center.
Wolf dubbed himself “the past guy” and remembered a time when he was considered a pioneer in knowledge of the Internet and how it was evolving in its early years. The trio of panelists discussed the future of Internet governance and the evolving challenges citizens face.
There’s been an enormous amount of growth and development during the Internet’s short life, noted Jeff Brueggeman, vice president of public policy at AT&T.
“I think the true strength of the IGF, as we talk about every year, is its ability to self-improve,” he said. “And, for all of us, from a bottom-up way, to help innovate and change the process each and every year.”
IGF introduces new topics and builds on those addressed the year before. The IGF is not just a “talk shop” that meets once a year, Brueggeman added.
IGF needs to keep broadening participation and the process, including bringing in peers from more developing countries. Expanded remote participation and growing attendance have been successes at the meetings, Brueggeman said. The discussion needs to keep evolving at IGF-USA and on a global basis; pressure is growing to show that IGF doesn’t simply have the same discussion year after year.
Brueggeman said those involved in IGF should do a better job of capturing the impact of the multi-stakeholder process and show the value of it to those who don’t come to the meetings and those who will never come.
Sustainability is a real challenge, though he has seen an enormous amount of progress. A few years ago, organizers and attendees were debating whether there would be an IGF the following year. Now, they debate what to build around the one-day conference.
Danny Weitzner of The White House Office of Science and Technology Policy – Wolf called him the “ghost of Internet future” – highlighted three things that are already happening.
“We are in the middle of a multi-stakeholder explosion and the question is how to actually help make sure it’s directed and productive and doing the right things,” Weitzner said.
Second, he said, Vint Cerf has eloquently pointed out that the Internet is now being actively used by more than 2 billion of the 7 billion people in the world, adding: “Attending to that is going to be tremendously important in the future.”
And his final point: “We are in an era of just inevitable and irresistible transparency. Sometimes even governments, sometimes companies, sometimes even civil society groups take refuge in un-transparent un-institutional activities because it’s often easier, it’s often safer. But I think we’re learning over and over again in a variety of different institutions that we’ve got to learn to embrace transparency, we’ve got to learn to make it work for us and that resisting it is a mistake.”
Top-down rule-making does not always lead to innovative solutions. The Internet keys into collective intelligence and is best served by the multistakeholder model of governance, Weitzner said.
Although there are many important issues to address as the Internet evolves, Weitzner said he thinks the real discussion under way is about making sure the Internet’s open environment can extend access beyond the current 2 billion users toward all 7 billion people.
— Ashley Barnas
Brief session description:
Thursday, July 26, 2012 – The dramatic reduction in the cost of computing and storage made possible by cloud computing services, the spread of easy-to-use, open-source analytic tools, and the growing availability of massive data services from governments and the private sector (e.g. Google Maps) have enabled thousands of start-ups, hackers and others to create exciting new tools for business, entertainment, government and other sectors. Government policies can help or hinder development of new databases and Big Data apps. Issues covered in this session included: 1) open government data policy; 2) Intellectual Property Rights protection; 3) IT research; 4) technology test beds; 5) education; 6) law enforcement access; and 7) privacy regulations.
Details of the session:
The moderator for the session was Mike Nelson, a professor at Georgetown University and research associate at CSC Leading Edge Forum. Panelists included:
- Jeff Brueggeman, vice president of public policy for AT&T
- Paul Mitchell, senior director and general manager, Microsoft TV Division
- Lillie Coney, associate director, Electronic Privacy Information Center
- Jules Polonetsky, director and co-chair, Future of Privacy Forum
- John Morris, director of Internet policy, NTIA/US Department of Commerce
- Katherine Race Brin, attorney, bureau of consumer protection, Federal Trade Commission
Mike Nelson, an Internet policy expert from Georgetown University, shared some reassuring remarks as he introduced a panel session that concentrated upon the complexities of managing what has become known as “Big Data.”
“These issues are not new,” he said. “We’ve been dealing with them for 20 or 30 years, but they are a lot more important now.”
Nelson explained in a workshop at IGF-USA Thursday at Georgetown Law Center that it’s not just about data that are big; it’s about data that are changing so quickly that they require innovative management tools.
He introduced the following questions:
- How will privacy concerns impact the development of large databases (or will they have any significant impact)?
- What are the liability issues of Big Data in the cloud?
- How do we deal with the shortage of data experts?
- How do we handle issues concerning control and access to data?
Jeff Brueggeman, a global public policy executive with AT&T and a longtime participant in Internet governance discussion in many fora, began the conversation by addressing a few of these issues.
First, he noted the importance of working with businesses as well as policymakers to come up with tools to manage the data. He also addressed the significance of maintaining security in cloud data.
“The more data that’s being collected and retained, the more that data could be used as a target,” Brueggeman said.
Brueggeman introduced some issues for the other panelists to debate: best practices for dealing with large data sets, the types of controls users should expect, which uses of data are legitimate without such controls, and the international dimensions of these questions.
Jules Polonetsky of the Future of Privacy Forum followed with a look at the long-term perspective, offering some insight about the impacts of cloud technology.
“I’ve always had a hard time getting my head around clouds,” Polonetsky said. “But the best we can do is make sure we’re a bit of a gatekeeper.”
He argued that a formalized procedure should be established for releasing private information to law enforcement officials and others seeking it. But he also elaborated on the risks of such technology, illustrating them with a story about a friend, a rabbi, who watched a racy video unaware that Facebook would automatically share the link on his Facebook page, showing how easy it is to inadvertently share online activity with the greater digital community.
Polonetsky champions the benefits of data use, but he also urges people to consider the implications of such sharing and storing of data. He said he believes there should be a debate in which people weigh the risks and benefits.
Katherine Race Brin continued the conversation, citing some of her experiences dealing with these issues in her job with the Federal Trade Commission.
She said the FTC has studied the implications of cloud computing for a number of years and has also considered how, or whether, the cloud differs from other forms of data transfer in regard to privacy.
She said her work at the FTC has led her to believe that the companies that are storing data in the cloud are often in the best position to assess the risks of that data sharing.
“We’ve always said, in relation to personal data, the businesses remain accountable for the personal data of their customers,” Brin said.
She said that while the FTC holds businesses responsible, it also provides a framework to ensure consumer privacy.
Brin explained the three key aspects of this framework:
- Privacy by design – Companies should build in privacy protection at every stage, from product development through product implementation. This includes reasonable security for consumer data, limited collection and retention of such data and reasonable procedures to promote data accuracy.
- Simplified consumer choice – Companies should give consumers the option to decide what information is shared about them and with whom. This should include a “do-not-track” mechanism that would provide a simple, easy way for consumers to control the tracking of their online activities.
- Transparency – Companies should disclose details about their collection and use of consumers’ information and provide consumers with access to the data collected about them.
Lillie Coney, associate director of the Electronic Privacy Information Center, offered her insights as an expert on Big Data.
“Governance is not easy,” Coney said. “But we do learn mechanisms for creating accountability, transparency and oversight.”
She noted that the difficulty lies in creating guidelines that have currency and legitimacy. In regard to cloud computing, Coney suggests that people are not only consumers; they themselves – or at least the sets of the private information they share – are actually products.
“Our online activity alone generates revenue, and many consumers don’t understand that,” Coney said.
She said she strongly believes in the importance of the public’s engagement in the conversation. With all these privacy concerns, Coney said the consumer cannot afford to leave it up to businesses or government.
Microsoft executive Paul Mitchell added some perspective on how to go about managing Big Data. “I think the big challenge here is figuring out what’s first when we’re talking about big data,” Mitchell said, noting the overwhelming amount of data being created and databased. “What we have here is not a new problem. What we have here is a problem of scale.”
Mitchell said we can look at the separate desires of people, businesses and society, and consider a philosophy based on each group’s needs. He explained that a people-first philosophy would ensure that potentially harmful data isn’t allowed to cause harm; a business-first philosophy would maximize the potential economic return from the use of data; and a society-first philosophy would optimize the value of the data for society as a whole.
“From an operating perspective, the challenges we face involve how to govern along these three axes,” said Mitchell. “The policymakers’ choice is how to balance the three appropriately.”
Nelson then asked a question directed at the panelists about the future of the cloud and whether there would be one cloud in an interconnected world or a world of separate clouds run by different companies.
Mitchell argued that there are circumstances that will require a private set of services.
Coney expressed concern over that model. “Consumers’ control over their data in a cloud-driven environment will require the ability to move their data from Cloud A to Cloud B. Making that a reality in this environment is going to be the challenge,” she said.
Polonetsky had a slightly different viewpoint. He considered the business-platforms perspective, questioning how easy it should be to move consumers’ data.
“Yes, it is your data, but did the platform add some value to it by organizing it in a certain way?” he asked, adding that the platforms may make a legitimate contribution by organizing consumers’ data. For example, your Facebook friends belong to you, but Facebook created a platform in which you interact with and share information, photographs and other things with them.
To conclude, Nelson took a few questions from the audience and asked each panelist for a recommendation regarding the future management of Big Data. Brueggeman suggested there should be a set of commonly accepted practices for managing Big Data. Polonetsky added there should be more navigable digital data. Brin supported the strong need for transparency. Coney proposed that cloud providers and Big Data companies must show their respect for a diverse group of stakeholders. Mitchell recommended that we should all work toward greater harmony between business, personal and societal values.
— Audrey Horwitz
This session delved into recently announced policy statements with future implications, including those made by the Organisation for Economic Co-operation and Development, the U.S. International Strategy for Cyberspace, the G8 and others. Are principles a feasible approach to underpinning Internet governance? If so, which ones? Should principles be applied through codification in law, memoranda of understanding or treaties? The workshop offered a mini-analysis of currently proposed sets of principles. Because the Internet and online services are global, the workshop took a global perspective.
Details of the session:
The co-moderators for the session were Fiona Alexander of the National Telecommunications and Information Administration (NTIA) and Shane Tews of Verisign. They hosted a session in which the following people first presented briefings on recently announced sets of principles.
Heather Shaw, vice president for ICT policy for the United States Council for International Business (USCIB), shared details of the OECD Communique on Principles for Internet Policy-Making: http://www.oecd.org/dataoecd/40/21/48289796.pdf.
Chris Hemmerlein, a telecommunications policy analyst for NTIA, spoke about the sections of the May 2011 G8 Declaration that focus on the Internet: http://www.g20-g8.com/g8-g20/g8/english/live/news/renewed-commitment-for-freedom-and-democracy.1314.html.
Sheila Flynn, of the cyber policy office of the U.S. State Department, briefed participants on the U.S. International Strategy on Cyberspace: http://www.whitehouse.gov/sites/default/files/rss_viewer/internationalstrategy_cyberspace.pdf.
Leslie Martinkovics, director of international public policy and regulatory affairs for Verizon, introduced the concepts of the Brazilian Principles for the Internet: http://einclusion.hu/2010-04-17/internet-principles-in-brazil/.
Sarah Labowitz, U.S. State Department, shared details of the Council of Europe’s Internet Governance Principles: http://www.coe.int/t/dghl/standardsetting/media-dataprotection/conf-internet-freedom/Internet%20Governance%20Principles.pdf.
The introduction of the principles was followed by a roundtable discussion moderated by Iren Borissova of Verisign. Participants were:
- Jackie Ruff, vice president for international public policy and regulatory affairs for Verizon Communications
- Milton Mueller, Syracuse University (participating over the Internet from a remote location)
- Jeff Brueggeman, vice president for public policy at AT&T
- Cynthia Wong, director of the Project on Global Internet Freedom at the Center for Democracy & Technology
- Liesyl Franz, vice president for security and global public policy for TechAmerica
- Mike Nelson, research associate for CSC Leading Edge Forum and visiting professor at Georgetown University
- Robert Guerra, director of the Internet Freedom program at Freedom House
- Susan Morgan, executive director of the Global Network Initiative
For all of the Internet-focused principles laid out by the OECD, the G8, the U.S. State Department and the Brazilian government, and all the lists of tenets and guidelines, the debate at the 2011 IGF-USA panel on “A Plethora of Policy Principles” boiled down to one question: Can the principles be successfully converted into actionable concepts?
Governmental parties, whether sanctioned by presidential administrations or formed through a multistakeholder process, are seeking to set out the boundaries within which they wish to act when the next contentious issue hits the web. The problem is that these lists, each of which might work effectively within a single cultural, regional or governmental context, must stretch across all boundaries in much the way the Internet itself does.
The policy principles included in the discussion, which in no way represent the entirety of idealized lists, were as follows:
-The OECD Communique on Principles for Internet Policy-Making, the most recent set, agreed upon by 34 member states, seeks to promote the free flow of information, the open nature of the Internet, investment and the cross-border delivery of services, and multistakeholder cooperation, along with a litany of other aims ranging from security concerns to liability for violations of the contained principles.
-The G8 Renewed Commitment to Freedom and Democracy, which isn’t solely focused on Internet rights issues, but nonetheless deals heavily with digital issues. The list segments Internet users into three groups: citizens, who seek to use the Internet as a resource and as a means to exercise human rights; businesses, which use it to increase efficiency and reach consumers; and governments seeking to improve their services and better reach their citizens. The G8 list also considers the Internet as the “public forum” of our time, with all of the associated assembly rights applied.
-President Barack Obama’s U.S. International Strategy for Cyberspace focused on the concepts of prosperity, transparency and openness. It represents an effort on the part of the U.S. government to approach Internet issues with a singular vision and seeks to establish an international framework to deal with these issues in the future. Interestingly, it was also the only list of principles discussed during the session that asserts a sort of “digital right to self-defense” in the instance of an attack on the United States’ own digital resources.
-The Brazilian Internet Steering Committee’s Principles for the Governance and Use of the Internet in Brazil differed from the other lists in that it was created after a series of discussions among interested governmental, NGO, private and scientific parties. The committee’s principles also stood for greater universality of the Internet, particularly a breakdown of linguistic barriers and a strict adherence to maintaining diversity in the digital domain. For those questioning why Brazil stands out, given the sheer number of countries with vested interests in Internet issues, Leslie Martinkovics, the director of international public policy and regulatory affairs for Verizon, said, “Brazil is seen as an opinion leader in the Americas. … they would like to take the high ground and lead the discussions going forward.”
-The Council of Europe’s Internet Governance Principles is the product of 47 member states, with an expressed focus on “affirming the applicability of existing human rights on the Internet,” according to Sarah Labowitz of the U.S. State Department. In addition to those concerns, the principles call for a clear series of planning, notification and coping mechanisms to be in place in the event of a cyber disaster.
Once the particulars and intricacies of the various plans had been laid out, the critiques began to fly in. Mike Nelson, research associate for CSC Leading Edge Forum and visiting professor at Georgetown University, played the self-admitted role of the skeptic.
“The first thing you do is hold a meeting, and we’ve been doing that for five years,” Nelson said, describing how meetings lead to research, research leads to a lengthy span of time, during which the public becomes discontented, after which a list of principles emerges to placate the masses.
Nelson did not want the discussion to be about whether “you do or do not stand for freedom,” but instead a fundamental debate on so-called “flashpoints”: actual, specific points of policy whose debate could result in legitimate action, as opposed to simply more principles.
Rebecca MacKinnon soon followed Nelson in critiquing the premise to which the entire panel was devoted, noting a tendency for the principles and conclusions reached by disenfranchised groups, including those outside the post-industrial West and the increasingly powerful emerging economies, to be at best given lip service and at worst outright ignored, both by interested parties and by IGF itself.
“What’s changed between 2004 and now?” MacKinnon asked. “How do people interpret these principles that have been, less or more, set in some degree of stone for quite some time?”
For the Chinese or Iranian dissident, she posited, rogue groups such as Anonymous and WikiLeaks do more for their cause than institutional bodies like IGF simply because they rely entirely upon action instead of dialogue, action that is particularly focused on powerful entities.
For all of the critiques piled on the notion of principles and the efficacy of IGF, there was an equal counter of support.
“The role of the IGF is exactly what it was set out to do. There has been discussion, and it has encouraged awareness,” said Heather Shaw, vice president for ICT policy for the United States Council for International Business.
She added that the State Department strategy published by the Obama administration contains many of the same concepts that were actively discussed at the previous year’s IGF meetings.
“The fact this discussion is happening everywhere points to the success of the Internet Governance Forum,” said Fiona Alexander of the National Telecommunications and Information Administration. “IGF is spurring these kinds of conversations.”
But the unanswered question lingering at the end of the session was whether those conversations, those discussions and that awareness are enough, as the Internet’s rapid advancement is now met with an equally rapid growth in governmental interest in its inner workings.
– Morgan Little
Internet Governance Forum-USA, 2011 Best Practices Forum ICTs for Disaster Response: Transforming Emergency Management
Responders are driving innovative uses of ICTs to transform emergency planning, intermediation and management. The Internet and social networking are being harnessed by search and rescue teams to locate and bring vital support to victims. ICTs are reassuring loved ones, bringing help to the stranded, raising financial aid, managing communications for responders and supporting rebuilding. This workshop explored the role communications, Internet and Internet-based applications play in disaster response and recovery operations and steps that can be taken to ensure continuity of operations following a disaster. It also considered the connection between disaster preparedness and Internet governance.
Details of the session:
Information and communication technologies are connecting public safety officials, allowing the efficient coordination of response operations and keeping citizens informed in new ways every day. Responders are driving innovative uses of ICTs to transform emergency planning, intermediation and management.
The Internet and social networking are being harnessed by search and rescue teams to locate and bring vital support to victims. The new Internet-based tools, mobile applications and social media that are transforming disaster relief efforts and empowering citizens were the focus of this workshop at the IGF-USA conference July 18 in Washington, D.C.
This session was moderated by Kelly O’Keefe, director of the Washington office of Access Partnership, a consultancy in international telecommunications trade, regulation and licensing. O’Keefe brings a global perspective to the topic as a rapporteur for an International Telecommunication Union study group on emergency communications.
The session’s panelists included:
- Joe Burton, counselor for technology and security policy, Communications and Information Policy, U.S. State Department
- Jim Bugel, assistant vice president for public safety and homeland security for AT&T
- Corbin Fields, Sparkrelief, a non-profit Internet-based organization empowering communities to provide disaster relief, http://sparkrelief.org/#
- Roland A. LaPlante, senior vice president and chief marketing officer, Afilias
- Keith Robertory, manager, disaster services technology, American Red Cross
- Tim Woods, technical leader, Cisco Systems
Kelly O’Keefe started the discussion by referring to recent global disasters, from the earthquake in Haiti to the earthquake and resulting tsunami in Japan. These events have demonstrated the importance not only of disaster response but also of relief communication, especially for developing countries, she said.
The biggest trend in disaster communication has been the migration toward Internet-based communications, said Tim Woods of Cisco. The influence and increased use of technology has become more widespread, and increasingly people turn to the Internet, particularly social media, to receive updates on events. Social media, in particular, allow users to send updates to followers immediately in real time.
But despite the widespread availability of technology and response services across the globe, the United States does not have the authority simply to step in and start setting up an information system in any country experiencing a disaster. Responding to a disaster in another country raises difficulties that don’t arise in the United States.
When the Red Cross responded to the earthquake in Haiti, Keith Robertory said, “We didn’t just say, ‘Hey, let’s get our suntan lotion and see what’s happening.’”
In addition to disseminating information to the public, the Red Cross had a responsibility to talk to the Haitian government and coordinate their needs among Red Cross organizations from other nations.
During such coordination efforts, U.S. organizations cannot make the same kinds of assumptions that they would usually make at home. There are differences in technology cultures that must be taken into account when setting up a communications network during a disaster, said Robertory, of the American Red Cross.
Communicating during an emergency
There should be an interest in the swift restoration of communications infrastructure to save lives in a country experiencing a natural disaster, Joe Burton said. There is a global trend toward catastrophic disasters, and with the rise of the Internet and social networks, Internet-based messaging and text messaging have become efficient uses of communications networks.
In terms of the big picture, more people own a basic phone, even a low-end model, than own PCs and TVs combined, said Vance Hedderel, of Afilias. That bigger picture helps disaster responders understand how to reach people. At this point in time, the phone is more effective than the Internet, and SMS data reach larger numbers of people.
Additionally, the goal of disaster communications should be to inform the people experiencing the disaster first-hand.
“A major gap currently exists where those people aren’t getting the necessary information and the outside world seems to know much more,” Hedderel said. “Those issues become so paramount when there is little infrastructure in place.”
When sending out information over the Internet, Robertory said it is critical to hit all social media sites. Since the emphasis is on getting information to the largest number of people possible, the disaster response teams have to reach their audiences across many platforms.
Establishing a network
From the service provider’s perspective, there is an emphasis on critical infrastructure during and after a catastrophic event, Woods said. The networks used for information sharing should be reliable and resilient to disruption, and a capacity plan needs to be in place to handle an emergency. What often happens is that networks become oversaturated immediately after a disaster, as users attempt to assure others of their safety or provide updates on the state of those affected.
Robertory likened establishing network capacity to a gym membership: “You hope that not everyone comes in to use the treadmills on the same day at the same time,” he said.
Although handling the surge in traffic that follows a disaster is important, a network is not sustainable if preparing for overcapacity becomes slow and expensive. The goal is a balance of capitalism and altruism, of saving lives and economic viability, making the most efficient use of resources possible.
Despite the importance of developing effectively working technology systems, these will be largely useless if various agencies involved cannot work together. Part of preparation is building relationships between agencies and determining who will communicate with whom.
“If you can build those relationships ahead of time, you have a better chance of getting through when disaster strikes,” Burton said.
Another side to preparedness involves having technology that works even in smaller situations, Robertory said. Attempting to prepare a system for a big event from the start leaves too much room for errors when such a situation actually occurs. If the system works for everyday emergencies, it allows time to test it and improve it for smaller upcoming events.
“It’s about being proactive, not reactive,” said Corbin Fields, of Sparkrelief.
– Carolyn VanBrocklin
Data retention, privacy, security, geo-location, mobility, government and law enforcement cooperation, and transnational location issues are among the emerging cloud computing challenges in Internet governance. Promoted by industry and government alike, “the cloud” seems to be the answer for providing emerging online services, addressing costs, access, diversity of infrastructure, reliability and security. Yet its extremely distributed nature raises Internet governance questions. This workshop addressed those questions, including the emergence of the mobile cloud.
Details of the session:
Where the cloud’s data is located, who has access to it and what happens if it’s breached took center stage during the cloud computing workshop at the IGF-USA conference July 18 in Washington, D.C.
The moderator for the session was Mike Nelson, a professor at Georgetown University and research associate at CSC Leading Edge Forum.
Panelists included a range of industry, governmental and civil organization representatives:
- Jeff Brueggeman, vice president of public policy for AT&T
- Danny McPherson, chief security officer for Verisign
- Amie Stepanovich, Electronic Privacy Information Center
- Marc Crandall, product counsel for Google
- John Morris, general counsel and director of Internet standards, Center for Democracy & Technology (CDT)
- Fred Whiteside, director of cybersecurity operations for the U.S. Department of Commerce, and National Institute of Standards and Technology Target Business Use Case Manager
- Jonathan Zuck, president of the Association for Competitive Technology (ACT)
Nelson said governments are happy to use the cloud because it would enable government applications to work better and would save money. But many insist on cross-border controls: the data has to stay within the host country.
“Tension between government controls on cross-border data flows is often caused by the desire for more privacy for citizens in their country versus the global cloud,” he said. “How do we get to a global cloud that is actually globalized, where data is allowed to move wherever it wants to and yet have the privacy assurances we’ve had in the past?”
There are many who believe location equals control, said Marc Crandall of Google. But that is not always the case when entering various servers and using a resource like the cloud.
“So location may not necessarily equal control,” Crandall said. “The thing about the cloud is I tend to feel that location does not necessarily equal secure. Where something is located doesn’t make it any more or less secure.”
Having governments worry about security standardization and privacy would be a better focus, he said.
Jonathan Zuck, president of the Association for Competitive Technology, said people need to begin to focus on international citizenry in regard to the cloud. It’s not about where the cloud is located or whose cloud consumers are using, but about looking at a larger, more competitive group of providers.
And where data are located can raise concerns about who has access to that information. If the data are located in a country with little judicial review or fewer privacy regulations, will users’ information be at risk?
“There should be an emerging global standard,” said Jeff Brueggeman, vice president of public policy for AT&T. “As to privacy, the more we improve international cooperation on cybersecurity and law enforcement, the more comfort there will be over the legitimate concern that if data is not stored locally, authorities can still go after a bad guy. But again, we have to deal with real issues as well as setting up the right policies to help distinguish between legitimate concern and government overreach.”
If there is a breach and private information has been hacked, as has been seen in recent attacks against Google and Sony, what should the companies do to be transparent but also uphold their legal obligations?
If an organization is hacked and information is stolen, but that’s not made known publicly, it could be a violation of fair disclosure, said Danny McPherson, chief security officer of Verisign.
“Lots of folks don’t share that type of information,” he said. “Every state or region or nation or union has different native laws, and that is extremely problematic from that perspective.”
There are many times that information may not be classified but is of a private nature, such as trade agreements that would need to stay confidential, said Fred Whiteside, director of cybersecurity operations for the U.S. Department of Commerce. It is complex, he said, and as someone who hears many classified discussions on security breaches, he added that it would trouble him for sensitive information to be made public.
Amie Stepanovich, of the Electronic Privacy Information Center, said businesses and industries should start worrying about encrypting information before it is hacked, instead of worrying about the cost-benefit analysis.
“I think the benefit of data encryption is really worth it,” she said. “It’s been proven again and again. Companies feel somehow they have to touch that burner to see if it’s hot before they move to that.”
Regardless, while the focus has been on the concerns and security issues surrounding the cloud, there are many benefits that should receive their due credit.
“I think the fact we are all here is a testament to the cloud,” she said. “Or else we wouldn’t be so concerned with what the problems are if we didn’t recognize there are so many benefits of the cloud.”
– Anna Johnson
Brief session description:
This panel, moderated by Robert Guerra of Freedom House, focused on critical Internet resources and how to ensure that the underlying principles that have led to the Internet’s success persist in the face of security challenges. These principles include openness (open standards, open technologies), accessibility, transparency, bottom-up decision-making, cooperation and multistakeholder engagement. Key to implementing these principles is a broadened understanding of the role of infrastructure providers, such as the global and national Internet service and connectivity providers who build and operate the backbones and edge networks. The panel was also expected to address some of the implications of implementing DNSSEC and IPv6 on a national basis, which contribute to the security and resiliency of critical Internet resources on a global basis.
Details of the session:
The Internet’s success well into the future may be largely dependent on how it responds and reacts to increasing security challenges, according to panelists in a critical Internet resources workshop at the IGF-USA conference July 21 in Washington, D.C.
The Internet continues to evolve. It is also growing, as it becomes accessible to billions more people. The major challenge of our generation is to make the Internet more secure while continuing to promote openness, accessibility, transparency, bottom-up decision-making, cooperation and multistakeholder engagement. It is important that organizations continue to retain these values as much as possible as they react to cybersecurity and cybertrust issues.
Panelists at this workshop included:
- Moderator Robert Guerra, Freedom House
- Trent Adams, outreach specialist for the Internet Society
- Matt Larson, vice president of DNS research for VeriSign
- Steve Ryan, counsel to the American Registry for Internet Numbers
- Patrick Jones, senior manager of continuity and risk management for ICANN
- Jeff Brueggeman, vice president for public policy for AT&T
Panelists all expressed a desire to continue to engage in multifaceted talks because a single governmental entity is not the solution; it takes many people working together. As Brueggeman put it, there’s no “silver bullet” for the issue of Internet security.
“What we do on a day-to-day basis is ensure that those conversations take place,” Adams said. “The (critical Internet) resource is not a thing you can touch. You have this mesh of interconnected components that is the critical resource. You can’t pull one of those components out. Everyone must be around that table.”
So what’s the solution? The answer to that question is still a little unclear because Internet service providers and other organizations are often reactive to issues. Brueggeman said it’s time to embrace a forward-thinking approach.
“Things can get complicated when you’re reacting to an attack,” he said. “The best way to deal with these things is to try to think about them up front. How do we detect and prevent rather than react after the fact? How can we have more cooperative information sharing before attacks to try to prevent them and have the best information we can?”
Ryan stressed, though, that not all government is bad. He said citizens and organizations need to think “carefully about what the role of the government is.” But still, there should be a symbiotic relationship.
“There’s become a sense in government policy circles, including in the most sophisticated, that somehow (the Internet) runs on its own and you can’t break it,” he said. “I have news for you: You can break it. We look at government as something that has an increasingly important role because the Internet has an increasingly important role in economies.”
Ryan continued by saying non-governmental organizations have a responsibility to work with governments and to educate the people who work in them. He and the other panelists agreed that an international governmental organization wouldn’t work, though, unless core Internet values are embraced and upheld. They said a set-up that would allow countries around the world to vote on how the Internet is governed would not be a favorable solution.
“Until we get it right,” Ryan said, “I think we’re muddling along rather well.”
DNS issues and DNSSEC
Larson spoke specifically about the security of the Domain Name System because he views the DNS as an absolutely critical Internet resource. “If you don’t have the DNS, you don’t have the Internet,” he noted. He said users can’t truly trust the DNS, though, which is a bit disconcerting because of its necessity.
He supports DNSSEC—Domain Name System Security Extensions—which give users digital signatures (origin authentication) and data integrity. “Once you have that, you can validate data and have a higher level of confidence that the data you’re getting back is valid,” Larson said.
(You can read more about DNSSEC here: http://en.wikipedia.org/wiki/Dnssec.)
He also said that DNSSEC makes DNS more trustworthy and critical to users as more applications—not just host names—depend on it. “We’re going to look back and realize it enabled a whole new class of applications to put information in the DNS,” Larson said. “Now you can trust the information coming out of the DNS.”
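The sign-and-validate flow Larson describes can be sketched in a few lines. Real DNSSEC signs resource-record sets with public-key cryptography (RRSIG records validated against a chain of trust up to the root); the sketch below substitutes a shared-secret HMAC from the Python standard library purely to illustrate the validate-or-reject idea, with hypothetical record values and key material:

```python
import hashlib
import hmac

# Toy illustration only: real DNSSEC uses public-key signatures (RRSIG
# records); a shared-secret HMAC stands in here to show the flow.
ZONE_KEY = b"example-zone-signing-key"  # hypothetical key material

def sign_record(name: str, rdata: str) -> str:
    """Produce a hex 'signature' over a DNS name/record pair."""
    msg = f"{name}|{rdata}".encode()
    return hmac.new(ZONE_KEY, msg, hashlib.sha256).hexdigest()

def validate_record(name: str, rdata: str, signature: str) -> bool:
    """A validating resolver recomputes the signature; tampering fails."""
    return hmac.compare_digest(sign_record(name, rdata), signature)

sig = sign_record("www.example.com", "192.0.2.1")
print(validate_record("www.example.com", "192.0.2.1", sig))    # genuine answer
print(validate_record("www.example.com", "198.51.100.9", sig))  # spoofed answer
```

The spoofed answer fails validation, which is the "higher level of confidence" Larson refers to: a resolver can mathematically reject forged DNS responses instead of trusting whatever arrives first.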
Going from IPv4 to a combination with IPv6
Ryan emphasized the importance of Internet Protocol version 6, or IPv6, the new Internet-layer protocol that will allow a “gazillion numbers,” vastly expanding the address space online. The pool of numbers left under IPv4 is rapidly shrinking. Ryan said the increased flexibility of IPv6 will allow for the continued growth of the Internet, but it won’t be a free-for-all.
“The numbers we have issued are not property,” he said. “We have a legal theory that’s embodied in every contract we’ve issued. They belong to the community. If you’re not using them, you have to give them back. They are in essence an intangible, non-property interest, so over the next couple of years there will be some very interesting legal issues.”
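The scale behind Ryan’s “gazillion numbers” is easy to make concrete. IPv4 addresses are 32 bits, IPv6 addresses are 128 bits, and a quick calculation with Python’s standard `ipaddress` module shows the gap (the `2001:db8::/48` prefix below is from the reserved documentation range, used here as a hypothetical customer allocation):

```python
import ipaddress

# IPv4 is a 32-bit space; IPv6 is a 128-bit space.
ipv4_total = 2 ** 32
ipv6_total = 2 ** 128
print(f"IPv4: {ipv4_total:,} addresses")
print(f"IPv6: {ipv6_total:,} addresses")

# Even one routine IPv6 site allocation (/48 leaves 80 bits of host space)
# dwarfs the entire IPv4 Internet.
site = ipaddress.ip_network("2001:db8::/48")
print(site.num_addresses > ipv4_total)  # True
```

A single /48 holds 2^80 addresses, so handing one to every customer still leaves the registries nowhere near exhaustion, which is why the transition removes scarcity as a constraint on growth.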
ICANN in action
Jones said ICANN, which recently passed its 10-year milestone, has continued to work collaboratively with the community to take on major initiatives, such as the introduction of internationalized domain names in the root.
“We have taken requests from countries for internationalized country codes and approved 15,” Jones said.
“There’s a huge development in those regions of the world where you can now have domain names and an Internet that reflects their own languages and scripts. That will have an impact as discussion around critical Internet resources continues, especially in the IGF space.”
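The internationalized domain names Jones describes rest on Punycode: non-ASCII labels are encoded into ASCII "xn--" form so the existing ASCII-only DNS can carry them. Python's built-in `idna` codec (which implements the older IDNA 2003 rules; modern registries follow IDNA 2008) can illustrate the round trip with a hypothetical name:

```python
# "bücher.example" is a hypothetical internationalized domain name.
domain = "bücher.example"

# Encoding converts each non-ASCII label to its ASCII "xn--" Punycode
# form, which is what actually travels through the DNS.
ascii_form = domain.encode("idna")
print(ascii_form)  # b'xn--bcher-kva.example'

# Decoding restores the Unicode form users see in applications.
roundtrip = ascii_form.decode("idna")
print(roundtrip)  # bücher.example
```

The same mechanism underlies the internationalized country-code top-level domains approved in the root, letting entire domain names appear in local scripts while the infrastructure underneath remains unchanged.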
Physical critical resources
Brueggeman said AT&T has a broader perspective of critical Internet resources because the company is responsible for carrying Web traffic and for the underlying infrastructure, not just involved in issues tied to the DNS. He said the transition to IPv6 is daunting because it’s not backward-compatible. His main challenge has been in outreach efforts to customers.
“We have to deal with a lot of traffic that’s generated as we’re making changes to DNSSEC and IPv6,” he said. “In some cases, you might create some new security concerns, but overall both are important essential transitions.”
Brueggeman emphasized that multistakeholder discussions will be important in the coming years.
“We really need all of the parties who have the direct stake at the table to be part of the solution,” he said. “We need to have the resources handled in a way that promotes openness and promotes interoperability. There’s a huge policy risk of not managing these resources in a multistakeholder way.”
– Colin Donohue, http://imaginingtheinternet.org
Panelists shared their philosophical differences about online confidentiality and self-regulation in a discussion about privacy and security implications for Web 2.0 at the Internet Governance Forum-USA conference Oct. 2, 2009, in Washington, D.C.
All panelists agreed that online privacy remains an important issue, and that corporations have an ethical and legal responsibility to ensure that their consumers continue to enjoy some level of anonymity and confidentiality online. But they disagreed about whether self-regulation or government-enforced standards are the best method to achieve that end.
Ginger McCall, EPIC staff counsel, said companies’ privacy policies are often laden with technical and legal jargon, making them difficult for users to comprehend. They are so dense that users often click through them without much acknowledgement.
Privacy policies, in my experience, are generally just disclosure policies. They don’t exist to protect users’ privacy. They exist to protect companies from liability. – Ginger McCall
McCall said an overriding concern is that the policies often allow companies to change their guidelines at any time, often with no notice to users.
A bigger problem, still, is that companies are able to collect information about users without ever providing them with the information they have gathered.
“One creative suggestion that I might make is that businesses just give consumers everything they know about them,” said Michelle Demooy, a senior associate of consumer-action.org. “If you’re not a bad actor, it can’t hurt you to give consumers everything you know about them. It can only strengthen your brand going forward.”
Both McCall and Demooy specifically expressed growing anxiety about cloud computing, which allows Web hosting services to house the documents and data of users on their corporate servers. (Think of Google Docs and Gmail, for example.) So what used to be on a person’s personal computer is now on a larger server.
“It’s great for information sharing and collaboration, but not for privacy,” McCall said. “But it allows companies or outsiders to create detailed profiles of users. We need to see a stronger security system and we need to see companies are following through. There needs to be a strong regulation of cloud computing. There should be binding legal standards, terms of services have to be revised and privacy policies must be more transparent.”
Kathryn D. Ratte, from the Federal Trade Commission’s Division of Privacy and Identity Protection, said the FTC supports self-regulation, not government directives. She said allowing technologies to emerge promotes innovation.
“Our policy has been to enforce self-regulation,” Ratte said. “We analyze what’s going on in the market and put forth standards to adhere to. The flexibility allows us in some ways to act more quickly. We can address these issues as they arise for consumers.”
Jeff Brueggeman, vice president of public policy for AT&T, said the FTC has laid down an ample baseline for legal protection on the Internet that certainly needs continual monitoring but not government intervention.
The FTC is taking a proactive but engaged approach. We don’t give consumers enough credit for the value they place on their privacy. More and more, privacy is going to be a marketing advantage that companies are going to assert on the Internet. What we want to have is competition to maintain and secure your privacy, as well. – Jeff Brueggeman
McCall, though, said self-regulation is not a strong enough policy and that legislation with teeth is definitely possible.
“Self-regulation in the Internet context fails because there’s not really enough transparency about what’s going on and what harm is happening,” she said. “A lack of transparency allows a company to act in whatever manner it wants in the short term to make money. It also suffers from the problem that it only allows for possible remedies after the fact. Having a real comprehensive regulatory system would allow companies to know what’s permissible and not permissible.”
The FTC has come out strongly saying that the rules that apply at the time of the collection of data have to continue to apply, and if there’s a change, the company should go back to the customer and get opt-in consent. – Kathryn D. Ratte
But McCall and Demooy both said vigorous legislation is possible, and if companies are acting in good faith and treating consumers with respect and responsibility, then they shouldn’t be worried about governmental regulations.
“Privacy policies have their place, but they aren’t really helping consumers,” Demooy said. “If they’re not working, let’s not bang our hammer against that stone. Let’s try to build something that does.”
-Colin Donohue, http://www.imaginingtheinternet.org