Posts Tagged ‘Center for Democracy and Technology’
Brief session description:
Thursday, July 26, 2012 – The US House of Representatives has passed four cybersecurity bills, and the US Senate has indicated an intent to consider cybersecurity legislation in the current session. The US Department of State is working with its global partners to develop relationships, collaborative action and norms of behavior for cyberspace. The US Department of Commerce has spearheaded a government initiative on botnets and is working with industry on botnet mitigation measures. The Department of Homeland Security is increasing its cybersecurity staffing for strategic and operational concerns. And the White House is transitioning its team on cybersecurity policy with a second cybersecurity adviser to the president. The Stuxnet and Flame attacks have captured international attention. Cybersecurity remains a key theme in discussions in the United Nations, the International Telecommunication Union, the Organisation for Economic Co-operation and Development, the Asia-Pacific Economic Cooperation, ICANN and the annual Global Internet Governance Forum. This workshop addressed questions such as: What are businesses, countries, and the technical community doing in this heightened era of cybersecurity concern? What should they be doing? What are the considerations for individual users here in the U.S. and around the world? How can all these pockets of activity help protect – and not hamper the protection of – the very medium that provides for productivity, communications, efficiencies, innovation, and expression?
Details of the session:
The session was moderated by Audrey Plonk, global security and Internet policy specialist at Intel Corporation. Panelists were:
- Tom Dukes, senior advisor, Office of the Coordinator for Cyber Issues, US Department of State
- Jeff Greene, senior policy counsel, Cyber Security and Identity, Symantec
- Kendall Burman, senior national security fellow, Center for Democracy and Technology
- Patrick Jones, senior director of security, ICANN
Panelists from the government and private sectors gathered at IGF-USA’s cybersecurity workshop to discuss how these entities are collaborating to deal with domestic cybersecurity threats and international cybersecurity issues.
The issue is especially pertinent right now: a number of high-level conferences and meetings were held in Washington and other locales over the summer of 2012 on this topic, and, as Plonk put it, “Cybersecurity is the new black.”
Jeff Greene, panelist and senior policy counsel of cybersecurity and identity at Symantec, agreed. “At this time three years ago, cybersecurity was something that was mentioned in passing,” he commented. “Now the interest is exponential.”
Symantec’s business is centered on protecting enterprises from cyberthreats. Greene, who until recently worked with the Department of Homeland Security, said that according to this year’s Symantec Internet Security Threat Report, 75 percent of the enterprises Symantec deals with were threatened with a cyber attack in 2011.
He added that while the incidence of spam decreased in 2011, there has been a shift to web-based attacks. Greene also said the government and private sector are working together to reduce such threats.
“It is remarkable how much of the threat dynamic in both sectors is the same,” Greene said. “We see criminal and other malicious activity largely the same as the government does, so this is all work through government, private and international cooperation.”
Panelist Kendall Burman had a different view on government access to private sector and citizen information in terms of cybersecurity. As a senior national security fellow for the Center for Democracy and Technology, she has spent time exploring security and surveillance from the perspective of a member of a group focused on consumer privacy.
“I think that the tricky area from a civil liberties perspective is when the government is in a position of receiving that information, making sure that that information is limited to cybersecurity threats, and what the government can then do once it receives it,” Burman said.
Panelist Tom Dukes, senior adviser for the Office of the Coordinator for Cyber Issues at the US Department of State, weighed in from a government standpoint on cybersecurity issues, including the important role of the US government in pushing other countries to increase their outreach and share their perspectives on cybersecurity issues.
“Obviously what the US says, the positions we take, are highly influential and they are certainly looked at by a great many other countries,” Dukes said.
“One thing that the US has been trying to do for the last couple years in terms of addressing cyberpolicy issues in general, cybersecurity included, is to try to take sort of a leadership role in helping shape the world debate on how we think about these issues.”
Dukes said that the US has also made progress in terms of leading a global discussion on reaching a consensus about cybersecurity norms. Greene said that while the U.S. would like to set its own cybersecurity policies, this could cause global problems.
“If everyone has a different set of rules, (global policymaking)’s going to be pretty difficult,” Greene said.
Panelist Patrick Jones, senior director of security for ICANN, shared his view that while US policymaking is important in terms of cybersecurity, politicians should be aware of the effects that any laws they make may have globally.
“It’s helpful for policymakers, when they’re coming up with legislation, that they think of the Internet as global and consider that the decisions they make may have technical impacts that they’re not considering that impact the way people are using the Internet today – give those a thorough understanding before decisions are made about a particular legislation,” Jones said.
One of the final points of discussion during the workshop was the differences between cybersecurity and information security.
In the discussion it was noted that cybersecurity, in the US view of Internet governance, deals primarily with protection from Internet threats. Information security, in the Russian and Chinese view, also encompasses control of content, including censorship of the civic sector and of material from many Western media and knowledge organizations.
Dukes said two considerations regarding openness and freedom of information persuade most world leaders to find common ground with the fairly liberal US position on cybersecurity issues.
First is the basic human rights aspect of the argument; many countries accept that people should, whenever possible within the bounds of public safety, have certain rights of free speech, communication and assembly. Most countries agree that this should apply online.
Dukes’ second point is the economic benefit of keeping the Internet as open and free-flowing as possible. “Many evolving world countries are really desperate to find ways that they can harness the power of the Internet to increase economic opportunity, to increase GDP, to increase development and growth,” he said. “Those arguments seem to be very pragmatic, but it’s hard for countries to disagree with that.”
— Mary Kate Brogan
Brief session description:
Thursday, July 26, 2012 – This workshop focused on the challenges of keeping the Internet open while simultaneously maintaining a safe and secure environment for individuals, businesses and governments. Governments encounter a wide-ranging set of issues and concerns that can limit an open Internet, including the cost of connectivity, spam/malware, intellectual property rights, human rights and objectionable content. Businesses often make decisions for business purposes that may contribute to closing off the Internet. Leaders in governments’ legislative branches, including the US Congress and its counterparts around the world, and business leaders do not always recognize the implications of the actions they take that might negatively influence the Internet. In addition, citizens may voluntarily but without full understanding accept moves that contribute to closing off the Internet, quietly accepting actions and decisions that affect its openness in a negative way. The session worked to identify the key characteristics of an open Internet; the global and national challenges that threaten this; the initiatives pursued to advance the open Internet; and multistakeholder engagement to develop and promote an open Internet.
Details of the session:
The session was moderated by Robert Guerra, principal at Privaterra and senior advisor to Citizen Lab in the school of global affairs at the University of Toronto. Panelists were:
- Ellen Blackler, vice president for global public policy, The Walt Disney Company
- Thomas Gideon, technical director of the Open Technology Institute at the New America Foundation
- Andrew McDiarmid, policy analyst at the Center for Democracy and Technology
- Julian Sanchez, research fellow at the Cato Institute
- Paul Diaz, director of policy for the Public Interest Registry
- John Morris, director of Internet policy, office of policy analysis and development of the US National Telecommunications and Information Administration
Between copyright infringement, intellectual property, piracy and protection of online privacy, the openness of the Internet is being threatened on all sides, according to six IGF-USA panelists, who gathered to define and assess the challenges to an open Internet Thursday at Georgetown Law Center.
“The free and open Internet oughtn’t be a free-for-all,” said Ellen Blackler, vice president for global public policy for The Walt Disney Company.
Balancing an open Internet against the need to ensure security and privacy and to minimize piracy has long loomed as one of the largest challenges to the future of the Internet. While members of this panel represented diverse Internet backgrounds, they all agreed that Internet policy must and will continue to evolve with the challenges posed by the struggle between these often-competing values.
What is Internet openness?
The definition of an open Internet differs even among seasoned IGF attendees.
John Morris of the National Telecommunications and Information Administration (NTIA) cited the principles of Internet openness recommended by the Organisation for Economic Co-operation and Development (OECD) last year, which highlight several key characteristics, including the opportunities for both collaboration and independent work.
An open Internet allows users to operate “independently of one another, so as not to have a centralized single body to control or impose regulations,” Morris said.
The Internet policymaking process additionally needs to be open for collaboration, Morris said.
“What is it that keeps barriers low, what steps can we take to address challenges?” asked Andrew McDiarmid, policy analyst for the Center for Democracy and Technology (CDT). “It’s about learning … to keep the process more open and open to more voices.”
Though the openness of the Internet is one of the Web’s key characteristics, challenges ensue when openness trumps privacy.
“The openness principle has failed the public in privacy interest,” Blackler said.
U.S. policies directly affect those abroad
In the United States, Internet access is virtually everywhere, but the major challenge for Internet openness in many other parts of the world is online accessibility, especially in remote areas and in developing nations.
“Access at an affordable cost is key because then we can innovate,” said panel moderator Robert Guerra, the founder of Privaterra.
Panelists agreed that though global policies across the board on the issues tied to Internet openness are unlikely to be established due to differing cultural values and standards from country to country, cooperation on the international scale is still quite important.
“Not that I think we need to achieve one global norm about a particular issue, but we need to achieve a global level of interoperability,” Morris said.
In some countries, global Internet operability is a major issue due to government blocking and filtering – the management of what content citizens may or may not access or share. Thomas Gideon of the Open Technology Institute noted the difficulties that global policymakers face with nations that exercise a great deal of control over available content.
“A large part of what I do in my work is to defend human rights online,” Gideon said. “That’s equally fraught with the risks that those trying to speak freely in contentious and crisis regimes face.”
Paul Diaz, director of policy for the Public Interest Registry, noted the challenge of making governance measures work both locally and globally. “What works in one environment, what may work here in the US, is not necessarily applicable in another country,” he said. “Ultimately, the Internet is global, and therein lies the challenge.”
Piracy and copyright: What is the solution?
When discussing the widespread nature of piracy online and the difficulty in regulating it, panelists differed in their preferred approach to dealing with the challenges of intellectual and copyrighted property.
“Companies like Netflix are slowly finding ways to shift from a product to a service model,” Julian Sanchez, a research fellow at the Cato Institute, said, suggesting this as one successful choice for property owners.
Sanchez argued that the best way to discourage piracy is to create services that offer consumers a wide variety of choices and control over consumption of goods at a fair price. He said this is a better method than exclusively offering products that can be copied and shared and pirated just as easily.
Private niches online: Social networking and the cloud
With the advent of social networking and the desire to share and access personal information, the Internet includes private and targeted content, as well.
Sanchez emphasized that the structure of the Internet should be seen more as a network of people and relationships than as a technological architecture.
Facebook’s Sarah Wynn-Williams said social networking represents the “desire for people to connect and share and be open,” adding that the future of Internet policy must meet these demands and “preserve the ability of people to [share personal content online,] which is genuinely under threat.”
Panelists also noted that files shared through cloud data storage remain as difficult to regulate as physically shared materials. Just as governments have largely chosen not to pursue copied CDs or cassettes passed among friends, content in the cloud is hard to trace and regulate.
— Madison Margeson
Internet Governance Forum-USA, 2011 – New Challenges to Critical Internet Resources: Blocking and Tackling – New Risks and Solutions
The security, stability and resiliency of the Internet are recognized as vital to its continued successful growth as a platform for worldwide communication, commerce and innovation. This panel focused on Domain Name System (DNS) blocking and filtering, the implementation of Internet Protocol version 6 (IPv6), and critical Internet resources in general. The panel addressed some of the implications of blocking and filtering; the implementation of DNSSEC and IPv6 on a national basis; and future challenges to the Internet in a mobile age.
Details of the session:
The Internet’s success well into the future may be largely dependent on how it responds and reacts to new challenges, according to panelists in a session on critical Internet resources at the IGF-USA conference July 18 in Washington, D.C.
The Internet continues to evolve. It is also growing, as it becomes accessible to billions more people. A major challenge now and in years to come is to make the Internet more secure while continuing to promote openness, accessibility, transparency, bottom-up decision-making, cooperation and multistakeholder engagement. It is important that organizations continue to retain these values as much as possible as they react to cybersecurity and cybertrust issues.
This workshop was conducted in two parts, both moderated by Sally Wentworth, senior manager of public policy for the Internet Society. Panelists included:
- John Curran, president and CEO of the American Registry for Internet Numbers (ARIN)
- Steve Crocker, CEO and co-founder of Shinkuro, and vice chair of the board for ICANN, the Internet Corporation for Assigned Names and Numbers
- George Ou, expert analyst and blogger for High Tech Forum http://www.hightechforum.org/
- Rex Bullinger, senior director for broadband technology for the National Cable and Telecommunications Association (NCTA)
- Paul Brigner, chief technology policy officer, Motion Picture Association of America (MPAA)
- Bobby Flaim, special agent in the Technical Liaison Unit of the Operational Technology Division of the FBI
- David Sohn, senior policy counsel for the Center for Democracy and Technology (CDT)
- Don Blumenthal, senior policy adviser for the Public Interest Registry (PIR)
- Jim Galvin, director of strategic relationships and technical standards for Afilias
Wentworth began the session by referring to it by an alternate title, “Blocking and Tackling.” The alternate title proved appropriate, as the two sets of panelists became more and more passionate in their assertions during the discussions on two topics that affect the future health of the Internet: the implementations of IP version 6, or IPv6, and especially Domain Name System (DNS) blocking and filtering.
The first panel consisted of John Curran, Steve Crocker, Rex Bullinger, Jim Galvin and Bobby Flaim. It centered on a discussion of IPv6 and the difficulty in implementing a system that is viewed as the “broccoli of the Internet” – something that is technologically necessary, but for consumers is less of a priority because a switch is not incentivized.
The technological necessity is an inevitability. IPv4 provides about 4.3 billion unique IP addresses. The central pool of addresses ran dry Feb. 3, 2011, when the last five blocks were handed out to the five regional address registries. Depending on the rate of growth, Curran explained, those addresses may not last very long; the Asia-Pacific registry has already handed out all of its addresses. The 7 billion people in the world can’t fit into 4.3 billion addresses, especially when many have more than one address to their names.
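The arithmetic behind the exhaustion problem is easy to check. A quick sketch, using the rough figures cited in the session:

```python
# IPv4 uses 32-bit addresses; IPv6 uses 128-bit addresses.
ipv4_space = 2 ** 32    # 4,294,967,296 -- the "4.3 billion" figure
ipv6_space = 2 ** 128   # roughly 3.4 * 10**38

world_population = 7_000_000_000  # rough 2011 figure cited above

# One address per person already overflows the IPv4 space ...
print(ipv4_space < world_population)   # True

# ... while IPv6 offers astronomically many addresses per person.
print(ipv6_space // world_population)
```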
“We have to move, the hard part is getting them to move,” Curran said, referring to consumers.
The biggest problem is that IPv6 and IPv4 are two languages that don’t talk to each other.
“You have to enable IPv6 in order to speak to IPv6 Internet users. It’s literally possible for us to break the global Internet if we don’t transition to IPv6,” Curran said.
The dual-stack model was identified by the Internet Engineering Task Force (IETF) as the most effective and efficient way to begin integrating IPv6 alongside IPv4.
“What we have is a coexistence process rather than a transition,” Crocker explained. “We’re going to have these two existing parallels. You’ve got pure IPv4 exchanges and conversations, pure IPv6 exchanges, then you’ll have somewhat more complex interchanges between IPv4 users and IPv6 systems and those will require some sort of translation.”
“The ISPs,” Curran said, “are stuck in the trap as to whether there’s enough demand to make the switch.”
Currently, most devices, such as laptops, are IPv6 enabled and have been for some time, so they’re coexisting, rather than transitioning directly from IPv4, Curran said.
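The coexistence model the panelists describe shows up directly in application code: a dual-stack client asks the resolver for every address family and tries each result in turn. A minimal sketch in Python, with the host and port as placeholders:

```python
import socket

def open_connection(host, port):
    """Connect over whichever IP version the host supports.

    getaddrinfo returns IPv6 (AF_INET6) and/or IPv4 (AF_INET)
    addresses; trying them in the order returned is the basic
    dual-stack pattern -- IPv6 is used when both ends speak it,
    IPv4 otherwise.
    """
    last_err = None
    for family, socktype, proto, _canon, addr in socket.getaddrinfo(
            host, port, type=socket.SOCK_STREAM):
        try:
            sock = socket.socket(family, socktype, proto)
            sock.settimeout(5)
            sock.connect(addr)
            return sock
        except OSError as err:
            last_err = err
    raise last_err or OSError(f"no usable addresses for {host!r}")
```

Modern stacks refine this loop by racing the IPv6 and IPv4 attempts in parallel ("happy eyeballs," RFC 8305) rather than trying them strictly in sequence.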
“Things are not going to be replaced overnight,” Galvin said. “The providers are ready, but the customers aren’t. The laptops are ready, but the interaction is not ready.”
There are other problematic implications of enacting the new IP version, particularly around ensuring it is known how IP addresses are being allocated and to whom, and around logging network address translations (NATs), according to Flaim. Another element was raised by an audience member: the possible advantages held by countries with less infrastructure than the United States. Is the United States being left behind because its Internet is too big? Crocker contended that it is plausible that some developing regions could leapfrog over IPv4 and go directly to IPv6.
DNS Blocking and Filtering
The second panel of the afternoon consisted of Crocker, George Ou, Paul Brigner, Galvin, David Sohn and Don Blumenthal and centered on a lively discussion about the merits of DNS blocking and filtering as an answer to copyright infringement.
The panel was divided in its views – some felt that DNS filtering and blocking represented an effective answer to copyright infringement and media piracy, while others felt that a solution based on technical adjustment was the wrong way to approach what they deemed a human problem, not a technical problem. While a consensus was not achieved, the discussion addressed serious concerns about the damaging effects of illegal downloading on the content production industry, as well as the greater constitutional, technical and political implications of the use of DNS filtering and blocking.
Panelists referenced the Protect IP legislation that is currently in the U.S. Senate. The legislation is aimed at off-shore websites that infringe copyright laws by hosting pirated media. One of the ways the bill works is to undercut the sites by going after their advertising and funding sources.
The trouble, Crocker explained, is that the blockages or filters are “trivial to get around,” and the motivation to circumvent them is there. Sohn agreed that DNS filtering is not beneficial, especially in this case, because it is easy to evade. Using DNS filtering to prevent users from accessing entire domains that may contain illegal content, rather than addressing the content itself on specific websites, is too broad, Sohn suggested.
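Why resolver-level blocking is "trivial to get around" is easy to see: the filter only removes the name-to-address mapping at one resolver, and nothing stops a user from supplying that mapping themselves. A hypothetical sketch (the domain and address below are illustrative, not real):

```python
import socket

# A user-supplied override table -- the same mechanism as an
# /etc/hosts entry or switching to an unfiltered resolver.
# (Illustrative name and documentation-range address.)
LOCAL_OVERRIDES = {"blocked.example": "192.0.2.1"}

def resolve(name):
    """Look up a hostname, consulting the override table first.

    If the name is overridden, no query ever reaches the (filtered)
    recursive resolver, so the block simply never applies.
    """
    if name in LOCAL_OVERRIDES:
        return LOCAL_OVERRIDES[name]
    return socket.gethostbyname(name)

print(resolve("blocked.example"))  # 192.0.2.1 -- the filter is bypassed
```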
Galvin agreed: “While content providers have a legitimate need to protect their assets, there seems to be the automatic assumption that DNS filtering is the right way to do that. You’re concerned about content, so you want to do content filtering.”
Galvin cautioned the panel and the audience to be aware of consequential damages.
Sohn also raised concerns about prior restraint. “Under prior restraint law, you cannot restrict speech unless it’s actually proven illegal,” he said. “DNS filtering orders would often occur on a preliminary basis, as part of a restraining order, rather than after a full legal determination by a court.”
There was further discussion that, on a technical level, the use of these tactics would be problematic because it would break DNSSEC and destabilize the Internet – especially since DNSSEC was designed to detect exactly that kind of tampering with DNS responses, Crocker maintained.
On the other side of the issue, Brigner explained that in using DNS filtering, criminal sites would be removed from the “global phonebook,” preventing individuals from accessing them and propagating the consumption of illegal media.
“We’re not asking for a new row of cannons,” he said, referring to Crocker’s earlier metaphor about the Vasa, a Swedish warship that sank because its poor design was based on the king’s desire for more firepower in spite of architectural faults. “Let’s use sound engineering principles.”
An audience member’s suggestion of an industry seal program was also met with varying levels of support and dissent. Ou said such a seal could easily be counterfeited, while others felt that a “human” solution, rather than a technical one, was the more appropriate answer to the problem.
In the end, the dialogue raised many new questions about the real world implications of IPv6 and DNS blocking technologies.
“It’s important that these various communities find a way to address the issues in a way that’s respectful of the technology, respectful of the global Internet and people’s need to carry out their business, and the freedom of expression,” Wentworth said in closing.
“There’s a lot of competing interests, but they don’t have to be mutually exclusive.”
– Bethany Swanson