Asia currently leads in establishing legal frameworks that allow self-driving cars to run public tests and operate commercially, which has drawn a number of tech firms to the region for autonomous testing. Many companies across the continent are also competing in the self-driving car race in one form or another: in China, for example, local tech giant Baidu is producing its own self-driving cars, while networking leader Huawei is pushing to build and maintain the 5G networks those cars will rely on. Europe is not far behind, with the UK permitting fully autonomous testing on a city-by-city basis. Germany, meanwhile, allows hands-off testing, but it holds automakers responsible for any accidents and requires that systems prioritize human lives above animals and property, which means that autonomous cars built for testing in Germany may need to be retuned to operate well elsewhere.
All of this stands in stark contrast to one of the premier breeding grounds for self-driving car technology, the United States. There is currently no federal law in the US governing autonomous vehicles specifically, so automakers and tech firms face a patchwork of regulations that varies from state to state, with many states disallowing testing entirely for now. While federal legislation may be in the cards in the near future, the current state of affairs makes it hard for self-driving car makers to run consistent programs for testing and deployment. Since most self-driving car AI systems learn collectively, pooling data across an entire fleet, this patchwork makes it more difficult for the cars already on the road to gather the data they need to handle new situations, which in turn makes it harder to convince regulators that the cars are safe enough for public roads in more places. This vicious circle is likely to continue until federal lawmakers finally approach the issue head-on.
Legal issues have long been among the biggest obstacles to the full deployment of self-driving cars, and even so, the technology has already been held at least partially responsible for several accidents, some of them fatal. In one recent case, for example, an Uber self-driving car failed to detect a pedestrian who stepped suddenly onto a dark highway outside any marked crosswalk; the pedestrian was struck by the car and killed. Incidents like that one demonstrate that the technology must advance well beyond mere parity with human drivers before wide public adoption can begin, and it is likely at least partly for this reason that lawmakers are finding it difficult to reach a worldwide, or even nationwide, consensus on guidelines.