At Duke Law, we believe lawyers should be leaders in the ethical development of technology and in guiding clients through the legal fallout when it goes awry. In this section, read how members of the Duke Law community are leading in an ever-evolving and interconnected world.
On a Wednesday afternoon in late January, Associate Clinical Professor Jeff Ward is leading 30 students in his Frontier AI & Robotics course in a wide-ranging discussion of automated vehicles, or AVs.
They start by considering the technological state-of-play: the radar, LiDAR, mapping software, cameras, and other data-recording devices that signal aspirations toward fully automated vehicles and that already allow even modest new cars to include a vast array of automated features. Based on engineering and psychology insights into human-robot interaction, the class probes the ways that partial automation may cause safety issues. Ward JD/LLM ’09 then shares aspects of the National Highway Traffic Safety Administration guidelines on AVs and offers an overview of the numerous federal, state, and local regulatory bodies that set safety and certification standards for vehicles, drivers, and roads.
The students, 20 from Duke Law and 10 from such disciplines as neuroscience, engineering, and public policy, discuss questions raised by the technology — what data, for instance, is being collected and how might it be used or abused? — as well as the strengths and potential weaknesses of the various legislatures, agencies, and courts charged with overseeing AVs, and the human driving challenges that might be helped or hindered by automation.
As they break into small groups to reflect on what Ward calls the “socio-tech framework” for autonomous vehicles, he reminds them to keep a series of overarching questions in mind: What are the goals of the emerging technology? What should be the roles of the various actors involved, such as government, industry, and consumers? Do the technological innovations fit within the existing regulatory framework, or are changes or new rules needed? And what specific policy and legal tools, such as liability regimes, should be used in regulation?
Foundational to Ward’s course design is his belief that machines run by evolving systems of artificial intelligence (AI) and other frontier technologies must develop in ways that protect human well-being and societal values and that, properly deployed, law and policy can further those goals without impeding innovation. The key to making that happen is reflected, in part, in his students’ varied academic paths: His class brings future lawyers together with future policymakers, technologists, and scientists to collaborate on working through the issues.
“At a time when technology is offering so much potential, lawyers have to ask not what technology can do but what it can do to help people and communities to flourish.”— Jeff Ward JD/LLM ’09
Lawyers can’t sit on the sidelines acting solely as counselors and risk managers as transformative and disruptive technologies develop, says Ward, who directs the Duke Center on Law & Technology (DCLT) and was appointed in late December as the Law School’s inaugural associate dean for technology and innovation by Kerry Abrams, the James B. Duke and Benjamin N. Duke Dean of the School of Law. His goal: to empower and enable Duke Law graduates not only to participate in conversations about bringing emerging technology to its highest and best use, but to lead them. “Lawyers can and should be leaders in this sphere,” he says. “They should be the visionaries who help us identify both social norms and ways of steering these technologies accordingly.” And they should be bringing other social thinkers into the conversations, adds Ward, who focuses much of his current research and writing on the implications of advanced AI systems that can autonomously generate highly realistic digital artifacts, or deep fakes. (Read more about Ward’s research.)
“We can’t cede technological development to the technologists. Even more than that, we need to offer something that makes the technologists want us to be part of the conversations.”
Institutionally, it’s an approach that begins with the firm grounding in law and ethics, as well as in legal analysis, research, and writing (including courses in legal technology), that characterizes a Duke Law education. Ward’s courses on artificial intelligence, robotics, and blockchain law and policy are just part of a robust program in innovation and entrepreneurship that includes the three-year dual JD/LLM in Law and Entrepreneurship and the corresponding one-year LLM for graduate lawyers. Clinical options like the Start-Up Ventures Clinic and the Community Enterprise Clinic, and research centers such as DCLT, the Center for Innovation Policy, the Global Financial Markets Center, the Center on Law, Ethics and National Security, the Center for the Study of the Public Domain, and the Bolch Judicial Institute, offer multiple opportunities for students to engage directly with technological and legal innovators and decisionmakers. And an increasing number of law students are taking part in campus-wide interdisciplinary research projects focused on emerging technology through Duke’s Science & Society and Bass Connections initiatives and such programs as Rethinking Regulation at the Kenan Institute for Ethics.
Ward sees cultivating an attitude of engagement towards emerging technology as more important than having an aptitude for any particular application. Lawyer-leaders, he says, engage in interdisciplinary dialogue and investigation, stay vigilant about emerging trends in tech and understand what aspects of law, policy, and regulation apply, imagine a nascent technology’s promise as well as understand its capacity for misuse, and are willing to critique and rethink traditional approaches to regulation and governance while keeping societal needs and values paramount.
Says Ward, “In crafting creative solutions to any problem, our mindset is to keep the human at the center. At a time when technology is offering so much potential, lawyers have to ask not what technology can do but what it can do to help people and communities to flourish.”
Developing a “symbiotic relationship” with technology
Professor of Law and Philosophy Nita Farahany ’04, PhD ’06, is dedicated to cultivating that mindset in students, both through her research and teaching at the Law School and through her leadership of the Duke Science & Society initiative, which she founded in 2013. She forged her own interdisciplinary path as a Duke student seeking to gain an educational grounding in law, philosophy, and science that would inform her understanding of the implications of scientific discoveries and emerging technology. “Now I try to create those same opportunities for students,” says Farahany, a leading scholar on the ethical, legal, and social implications of emerging technologies and chair of the Duke MA in bioethics and science policy. (Read more about Farahany’s research.)
Technical literacy is already a skill that good lawyers must have, Farahany says, but the many interdisciplinary curricular and investigative opportunities at Duke can help students learn to go deeper to forecast what technological and scientific advances mean for society and communicate with experts from different areas to ensure they proceed ethically.
“It’s rarely, if ever, about stopping a technology or saying this is ethically impossible,” she says. “Instead, it’s figuring out if there is a way for society to be an active partner in helping to shape the direction and applications of the technology and research that occurs in ways appropriate to all subjects and stakeholders who might be involved.

“I think the most important roles going forward are for effective leaders who can serve as the bridges between spotting the connections and helping people to see the connections.”
Farahany stresses the importance of early engagement with new technology, pointing out that she spends significant amounts of time reading scientific journals. That’s how she recently learned of a study aimed at erasing the memory of cocaine use in rats through the use of a neural stimulation technique called optogenetics. The study could hold promise for addiction treatment if, in fact, pleasurable memories of being high could be reduced or eliminated, she says.
“I think the most important roles going forward are for effective leaders who can serve as the bridges between spotting the connections and helping people to see the connections.”— Nita Farahany ’04, PhD ’06
“Obviously this memory study in rats has possible implications for human research on memory modification. And memory modification has huge implications for law — we deal with witness memories all the time. We deal with individuals who have traumatic memories of past experiences and award damages in tort law to compensate for that trauma.” Farahany fires off a list of questions that the technology could raise: What does it mean to be able to modify memory? Should it be allowed at all? How would the law regard a witness whose memory was modified? What does it mean for an addict charged with drug possession who might be treated through memory modification?
“That’s the kind of work you have to do to build effective bridges,” she says, noting that the researcher doing the study on rats isn’t thinking about the implications for law, and law students, legal scholars, and judges usually aren’t reading scientific journals about rat studies.
“If you wait until the technology is fully here, we’re already behind as people seek to implement it, and apply it in ways that may have negative ethical or legal implications for society,” Farahany says. “So we aim to inspire in students a curiosity to engage early in technological and scientific development and couple it with a depth of knowledge that allows them to identify new advances and think creatively about the science-fiction version of how these might unfold or be applied.”
At the same time, thinking critically is also essential, says Farahany, who is the principal investigator of SLAP Lab (Science, Law & Policy Lab), with research projects focused on fashioning a right to likeness for non-celebrities and contemplating what misuse of an ordinary person’s likeness might mean, how claims by young adult defendants rooted in developing brain theory are faring in courts, and public attitudes towards AI decision-making in different contexts. She warns against assuming that “every new thing” is a breakthrough, emphasizing the importance of replication in science.
“And even if it works in the lab, can it work in another context?” Farahany asks. “You have to be very wary of misapplication.” That is particularly true in criminal proceedings when an individual’s life or liberty may be at stake, she says.
“The earlier you can immerse yourself in a technology and understand both its promises and its limitations, the more effective you can be as a law student or a lawyer in guiding judges and others in deciding whether it should be admitted in court, applied for forensic purposes, or even imposed as a sentence if, for instance, a pedophile’s pleasurable memories could someday be eliminated. Even if those possibilities are decades away, you need them on your radar.

“We have to train our students to understand what technology is out there, how to learn about new technologies that may not be out there yet but will be, and how to develop a symbiotic relationship with technology to enable them to be lawyers of the future.”
Assessing and expanding the regulatory toolbox
To nurture disruptive advances and ensure they provide the maximum benefit to society will require a rigorous and forward-thinking approach to regulation. Indeed, assessing the applicability of regulatory measures to emerging technologies and altering or expanding them as required is the primary purview of lawyers and an ever-growing focus of faculty research and teaching.
Says Farahany: “Any time you’re asking about regulation of technology, you’re asking, really, what’s the right balance between progress, ethical progress, innovation, and social welfare more generally? You don’t want to overregulate and impede progress. Nor do you want a ‘let-’er-rip’ model where technology is allowed to develop without any oversight.” And oversight of AI-based technologies that may be used to predict high-stakes matters, such as the likelihood that an inmate up for parole will reoffend, which cancer drugs will most effectively shrink a tumor, or which vehicular safety measures are needed in dangerous driving conditions, demands a level of operational transparency that doesn’t eliminate trade secrecy as an incentive for innovation, she says.
“The challenge is, then, to find the right balance between encouraging progress and balancing that against some of the concerns that can arise.”
Professor Barak Richman, a scholar of health care policy, antitrust, and institutional economics, says the solution to assessing and expanding the regulatory toolbox involves, in part, a renewed commitment to legal creativity. Lawyers, says Richman, the Edgar P. and Elizabeth C. Bartlett Professor of Law and Professor of Business Administration, need to draw lessons from the team approach tech and other industries apply to research and development: “You want a legal ‘R & D’ team that thinks about the laws and regulations that need updating. Maybe some are being misapplied or applied over-broadly. But if you think about laws creatively and dynamically, then you will have an eye for what the technological and economic needs are as opposed to just thinking about the law as an inflexible ‘black-letter’ framework that you have to apply.”
“[I]f you think about laws creatively and dynamically, then you will have an eye for what the technological and economic needs are as opposed to just thinking about the law as an inflexible ‘black-letter’ framework that you have to apply.”— Barak Richman
Avoiding “regulatory ossification” isn’t just about motivating Congress, says Richman, who has been examining the capacity of the regulatory framework to address the provision of health care services in his scholarship — and finding it lacking. (Read more about Richman’s research.) “It’s about thinking of what flexibility current regulations have, how certain rules can be revisited, and how certain agencies can be engaged.”
Richman regularly teaches a mix of law and business students in his health care policy course. “That makes the experience more akin to having lawyers in the C-suite, as opposed to having them cordoned off in the legal department,” he says. “It allows lawyers to think entrepreneurially like business students do. And it encourages the business students to understand how valuable a legal strategy can be to overall corporate strategy.”
Richman’s goal is to encourage his students to think about the law in a prospective way and not just a categorical, rigid, retrospective way. “I start with the fact that we have a health care system that has some very serious problems,” says Richman. “Let’s understand what those are, then look at these major areas of law and think about how they might mitigate problems or how they might exacerbate problems. You need to be problem-solving, you need to be interdisciplinary, you need to think across traditional categories.
“It’s not just a better way to be a lawyer, but it also shows how much value creative and smart lawyering can bring to these areas of technology,” he says. “If we can re-conceptualize the way current law should be applied, and it’s different from the way current law is being applied, then we can be as necessary a part of the creative process as the engineers or software developers.”
Fintech, bitcoin, blockchain, and beyond
In his Fintech Law and Policy course, Lecturing Fellow Lee Reiners frequently addresses regulatory challenges facing the financial services industry, which has been substantially transformed by the entry of tech-enabled non-traditional players over the past decade. “In finance, technology is being used in radically new ways,” says Reiners, executive director of the Global Financial Markets Center (GFMC). “Someone can now start a business offering loans to small businesses with nothing but a laptop.”
The banking industry is heavily regulated, Reiners points out, and whether they are advising an established fintech client or working in-house with a start-up, Duke Law graduates will be relied upon to steer clients through choppy waters.
Reiners aims to give them the knowledge and skills to navigate both scenarios. Examining the use of technology in such services as lending and payments, wealth management, and account aggregation, as well as cryptocurrencies enabled by blockchain technology, he asks his students to think through how they fit within existing regulatory structures and, if it’s unclear, what policy changes may be needed.
“One of the things students come away with from my course is an understanding of how fragmented the U.S. financial regulatory structure is,” he says. “We have an alphabet soup of regulators that have different opinions and approaches to new technology. What does that mean for the development of fintech? From a young lawyer’s standpoint, it represents a great opportunity.” Many of his students, in fact, find themselves being the first to apply legal research and analysis to fintech-related topics in their term papers.
The regulatory “soup” applicable to cryptocurrency like bitcoin is particularly messy, Reiners explains, with the Commodity Futures Trading Commission (CFTC) classifying it as a commodity, the IRS treating it as property, and most states considering it to be money, which means that cryptocurrency firms must register as money transmitters in each state in which they wish to operate. The Treasury Department’s Financial Crimes Enforcement Network (FinCEN), which is responsible for combating money laundering and terrorist financing, also considers cryptocurrency to be money, meaning cryptocurrency firms are required to flag suspicious transactions and file the relevant reports with FinCEN.
In articles on the GFMC’s FinReg Blog (read more) and in his online course on the Coursera platform, Reiners has addressed the way cryptocurrencies have entered the mainstream since bitcoin and its underlying technology, blockchain, were unveiled in an October 2008 white paper. “There are now over 2,000 cryptocurrencies and they’ve spawned an entirely new way for companies to raise money, with blockchain being utilized across multiple industries,” he says. “Mainstream financial institutions are now getting involved. It’s hard to point to something that’s had that big of an impact in such a short period of time.”
The underlying purpose of blockchain — to keep an immutable and secure ledger of transactions that is not controlled by any one entity — has the potential to be transformative in ways beyond cryptocurrency, Reiners says. “It could change the way society operates, pushing aside centralized institutions and facilitating a reclamation of control over our information, our privacy, and our security.”
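The “immutable and secure ledger” Reiners describes can be illustrated with a toy hash chain (a simplified sketch for intuition only, not how any production blockchain is implemented, and omitting consensus, mining, and signatures): each block records the hash of the previous block, so altering any past entry breaks every subsequent link and the tampering is immediately detectable.

```python
import hashlib
import json

def block_hash(block):
    """Deterministically hash a block's contents (order-stable JSON)."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, transactions):
    """Append a block whose header commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})
    return chain

def verify(chain):
    """Recompute every link; tampering with any earlier block breaks the chain."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

ledger = []
append_block(ledger, [{"from": "alice", "to": "bob", "amount": 5}])
append_block(ledger, [{"from": "bob", "to": "carol", "amount": 2}])
assert verify(ledger)  # intact chain checks out

# Rewriting a recorded transaction invalidates every later link.
ledger[0]["transactions"][0]["amount"] = 500
assert not verify(ledger)
```

In a real blockchain the same commitment structure is replicated across many independent nodes, which is what removes the single controlling entity: no one party can rewrite history without every other node noticing the broken links.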
In addition to examining blockchain’s disruptive potential for financial and other intermediaries in his Blockchain Law and Policy Lab, Ward also considers its promise for democratizing access to resources and promoting civic participation. It remains nascent, says Ward, who is fond of categorizing technological advancements as being either “here,” “near” — the category in which he places most blockchains — or “oh-my-dear,” referring to the stuff of hair-raising science fiction.
“Blockchain causes some challenges as an omni-use technology,” he says. “Almost every single characteristic of blockchain can be both good and bad. For example, its partial anonymity might offer women financial autonomy in systems where banking is disallowed for them while concurrently obscuring the fundraising activities of terrorist groups and impeding the ability of financial intelligence units to do their jobs.”
The challenge, he says, is to recognize and facilitate development of the technology’s promise while also mitigating its potential harms. “That’s not just a legal or policy task, but a legal, policy, technical, and cultural task. You have to have all sorts of pressures on the system.
“If we acknowledge the fact that our current regulatory system is always going to have to be adapting, how can we build tools to anticipate where something might go? We have to recognize the technological potential and apply creativity and imagination to scope out all of its possible uses and consequences. No technology is neutral in its social impact.”
Ethical tech: Human needs are paramount
During the Law School’s Wintersession in early January, Ward and Richman teamed up to teach a course titled Designing Creative Legal Solutions, premised on the importance of bringing people with a range of perspectives together and looking beyond traditional boundaries to solve complex legal and social problems. Both use variations of this “design thinking” technique in their broader teaching.
During their course, they teamed students with local housing and public health officials, the chair of the Durham County Commission, a community activist who has faced homelessness, and faculty from the Law School’s Civil Justice Clinic to design solutions to Durham’s eviction crisis. The guests also offered students practical perspectives on eviction and its often-devastating effects on individuals and families. Outside of class, students delved into the court dockets and reached out to faith communities, real estate developers, and others interested in housing to gain further insights into the problem and the “pain points” of all parties involved.
With much of the course devoted to small-group brainstorming on ways to reduce the negative impact of evictions, both Richman and Ward stressed the importance of teamwork and bringing an open and non-judgmental “beginner’s mindset” to the process. “Lawyers can be bad at suspending criticism,” Ward said. “Do not say the word ‘but,’ and instead respond to an idea with ‘Yes, and … ’”
Many of the ideas that emerged from the course involved legislative changes and incentives to construct affordable housing. Only one suggestion depended on technology, an online “legal-circumstances matrix” that would allow judges to understand the specific challenges of locating alternative housing for each tenant before them to find a fair and workable solution for all parties. And that cuts to the essence of design thinking: Human needs are paramount, with technology simply offering new ways to deliver legal and policy solutions.
“[Design thinking is] simply a way of encouraging lawyers and others to look at a system and say, ‘It doesn’t have to be this way.’ So it’s a prerequisite to making sure that technology ends up playing the role that it needs to play and isn’t simply used to entrench incumbent players and maintain gaps in equality and access.”— Jeff Ward
That human-centeredness was also apparent in Ward’s last blockchain class of the fall semester, which he devoted to an exercise in ethical tech development and design thinking. He challenged his students to apply the attributes of blockchain technology, such as the ability to move value or assets like data, to coordinate a network of distributed actors, and to sidestep intermediaries, to the design of a social-media network and a ride-sharing service.
One group, identifying the need to minimize fake news and “click bait” on their social-media platform, came up with a plan to offer users utility tokens exchangeable for services or site access as an incentive for reading ads. Another group with similar goals proposed offering utility tokens to site users who checked facts in social media posts. And a group designing a ride-sharing service proposed building employment protections for drivers into the digital architecture through the use of smart contract code.
At the end of the class, Ward had each student write one guiding principle or rule of conduct for ethical tech development on a sticky note. Many of their suggestions, he reports, focused on ways to balance innovation and safety, ways to include marginalized communities in technology development, and ways to reshape cultural understandings of privacy.
Ward says he is increasingly committed to design thinking as a “process, an attitude, and a key component of modern legal education,” but he is quick to note that the process should never “fetishize technology.” Rather, technology should be viewed as a useful tool to address human need: “It’s simply a way of encouraging lawyers and others to look at a system and say, ‘It doesn’t have to be this way.’ So it’s a prerequisite to making sure that technology ends up playing the role that it needs to play and isn’t simply used to entrench incumbent players and maintain gaps in equality and access.”
With technology developing exponentially, from AI systems used in diagnostics, decision-making, and data collection to the emergence of sophisticated brain-computer interfaces, lawyers have a special role to play in shaping its governing principles, protocols, and use, Ward says. “We’d better step up. But these reflect really exciting challenges for lawyers.”