Indiana Chief Justice Loretta Rush. (Indiana Supreme Court)

An Indiana man may beat a drug prosecution after the state's highest court threw out a search warrant against him late last week. The search warrant was based on the idea that the man had "stolen" a GPS tracking device belonging to the government. But Indiana's Supreme Court concluded that he'd done no such thing—and the cops should have known it.

Last November, we wrote about the case of Derek Heuring, an Indiana man the Warrick County Sheriff's Office suspected of selling meth. Authorities got a warrant to put a GPS tracker on Heuring's car, getting a stream of data on his location for six days. But then the data stopped.

Officers suspected Heuring had discovered and removed the tracking device. After waiting for a few more days, they got a warrant to search his home and a barn belonging to his father. They argued the disappearance of the tracking device was evidence that Heuring had stolen it.

During their search, police found the tracking device and some methamphetamine. They charged Heuring with drug-related crimes as well as theft of the GPS device.

But at trial, Heuring's lawyers argued that the warrant to search the home and barn had been illegal. An application for a search warrant must provide probable cause to believe a crime was committed. But removing a small, unmarked object from your personal vehicle is no crime at all, Heuring's lawyers argued. Heuring had no way of knowing what the device was or who it belonged to—and certainly no obligation to leave the device on his vehicle.

An Indiana appeals court ruled against Heuring last year. But Indiana's Supreme Court seemed more sympathetic to Heuring's case during oral arguments last November.

"I'm really struggling with how is that theft," said Justice Steven David during November's oral arguments.

“We find it reckless”

Last Thursday, Indiana's highest court made it official, ruling that the search warrant that allowed police to recover Heuring's meth was illegal. The police had no more than a hunch that Heuring had removed the device, the court said, and that wasn't enough to get a search warrant.

Even if the police could have proved that Heuring had removed the device, that wouldn't prove he stole it.


  • Fenway Park in Boston, as seen in Apple Maps' Look Around feature. Samuel Axon
  • The US Capitol Building in Look Around.
  • The Philadelphia Museum of Art in Look Around. Samuel Axon
  • Boston Public Garden in Look Around. Samuel Axon
  • The White House in Look Around—or about as close as you can get, anyway. Samuel Axon
  • Philly's Reading Terminal Market in Look Around.
  • Harvard Square in Look Around. Samuel Axon
  • DC's Washington Monument in Look Around. Samuel Axon

Apple Maps has been slowly expanding regional coverage for its Google Street View-like Look Around feature, and now MacRumors forum members have spotted rollouts for the feature in the US cities of Philadelphia, Boston, and Washington DC.

Look Around was added as a feature in iOS 13 last September, but it launched with coverage only in or near San Francisco. Like Google Street View, the feature allows users to zoom in to street-level photography of most streets in an urban area. Apple displays Yelp listings and other data on real-world buildings and monuments in the viewport when Look Around is displayed in full screen.

Generally, we have observed that the resolution and quality of the photography are better than what we've usually seen in Google's version.


World Health Organization (WHO) Director-General Tedros Adhanom Ghebreyesus gives a press conference on the COVID-19 situation at Geneva's WHO headquarters on February 24, 2020. (Getty | Fabrice Coffrini)

As outbreaks of the new coronavirus flare up in several countries beyond China, experts at the World Health Organization on Monday tried to rein in fears and media speculation that the public health emergency will become a pandemic.

“I have spoken consistently about the need for facts, not fear,” WHO Director-General Tedros Adhanom Ghebreyesus said in a press briefing Monday. “Using the word pandemic now does not fit the facts, but it may certainly cause fear.”

As always, the director-general (who goes by Dr. Tedros) and his colleagues at WHO tried to shift the conversation away from speculation and worst-case scenarios. Instead, they want to focus on data and preparation. In doing so, though, Dr. Tedros noted that some of the latest figures in the epidemic are “deeply concerning.”

Since last week, officials have reported rapid increases in COVID-19 cases in several countries, namely South Korea, Iran, and Italy. As of Monday, February 24, South Korea has confirmed 763 cases and 7 deaths—a dramatic rise from the 30 cases and zero deaths it had tallied just a week ago.

The situation in Italy, likewise, went from 3 cases at the start of last week to 124 confirmed cases and two deaths Monday. Iran went from zero to 43 cases in the same period and has reported eight deaths.

The figures have led to many media reports over the weekend speculating on whether the new coronavirus outbreak is or would become a pandemic. For now, Dr. Tedros said, it is not.

“Our decision about whether to use the word pandemic to describe an epidemic is based on an ongoing assessment of the geographical spread of the virus, the severity of disease it causes and the impact it has on the whole of society,” he explained. “For the moment, we are not witnessing the uncontained global spread of this virus, and we are not witnessing large-scale severe disease or death.”

Assessing risk

Dr. Tedros summarized some of the latest data on cases and disease from China, noting that cases there have been declining since February 2.

In Wuhan, where the outbreak began in December, the COVID-19 fatality rate appears to be between 2 percent and 4 percent. US experts have noted that this high fatality rate may partly reflect the fact that health systems in the city have been extremely overwhelmed by the outbreak and facilities have run short of medical supplies.

Outside of Wuhan, the COVID-19 fatality rate in China is approximately 0.7 percent, Dr. Tedros said. But many public health experts have suggested that even that figure may be higher than the actual fatality rate because many mild, nonfatal cases may have gone uncounted. If counted, those mild cases would enlarge the denominator of the calculation, leading to a lower fatality rate.
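The dilution effect is simple arithmetic: uncounted mild cases enlarge the denominator while the death count stays fixed. A minimal sketch (the figures below are illustrative only, not actual case counts):

```python
def fatality_rate(deaths, counted_cases, uncounted_mild_cases=0):
    """Case fatality rate, as a percentage of all cases (counted or not)."""
    return 100 * deaths / (counted_cases + uncounted_mild_cases)

# Hypothetical numbers: 7 deaths among 1,000 counted cases...
print(fatality_rate(7, 1000))        # 0.7 percent
# ...but if another 1,000 mild cases went uncounted, the true rate is lower:
print(fatality_rate(7, 1000, 1000))  # 0.35 percent
```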

Katherine Johnson sits at her desk with a globe, or "Celestial Training Device." (NASA)

Katherine Johnson, a trailblazing mathematician best known for her contributions to NASA's human spaceflight program and who gained fame later in life due to the movie Hidden Figures, died Monday. She was 101 years old.

"At NASA, we will never forget her courage and leadership and the milestones we could not have reached without her," said NASA Administrator Jim Bridenstine. "We will continue building on her legacy and work tirelessly to increase opportunities for everyone who has something to contribute toward the ongoing work of raising the bar of human potential."

Born in rural West Virginia on August 26, 1918, Johnson showed an aptitude for mathematics early in life. “I counted everything,” she said late in life of her formative years. “I counted the steps to the road, the steps up to church, the number of dishes and silverware I washed… anything that could be counted, I did.”

When West Virginia decided to integrate its graduate schools in 1939, Johnson and two male students were selected as the first black students to be offered spots at the state's flagship school, West Virginia University. She left her teaching job and enrolled in the graduate math program.

In the wake of the Soviet Union's launch of the Sputnik spacecraft in 1957, Johnson supported some of the engineers who went on to found the Space Task Group, which morphed into NASA in 1958. At the new space agency, Johnson performed the trajectory analysis for Alan Shepard's flight in May 1961, the first time an American flew into space.

Most notably, in 1962, she performed the critical calculations that put John Glenn into a safe orbit during the first orbital mission of a US astronaut. NASA engineers had run the calculations on electronic computers, but when someone was needed to validate the results, Glenn and the rest of the space agency turned to Johnson.


Most open source projects are vastly more restrictive with their trademarks than their code. OpenBSD's Puffy, Linux's Tux, and the FSF's Meditating Gnu are among the few FOSS logos that can easily and legally be remixed and reused for simple illustrative purposes. (OpenBSD, Free Software Foundation, Larry Ewing, Seattle Municipal Archives)

Most people have at least heard of open source software by now—and even have a fairly good idea of what it is. Its own luminaries argue incessantly about what to call it—with camps arguing for everything from Free to Libre to Open Source and every possible combination of the above—but the one thing every expert agrees on is that it's not open source (or whatever) if it doesn't have a clearly attributed license.

You can't just publicly dump a bunch of source code without a license and say "whatever—it's there, anybody can get it." Due to the way copyright law works in most of the world, freely available code without an explicitly declared license is copyrighted by the author, all rights reserved. This means it's just plain unsafe to use unlicensed code, published or not—there's nothing stopping the author from coming after you and suing for royalties if you start using it.

The only way to actually make your code open source and freely available is to attach a license to it. Preferably, you want a comment with the name and version of a well-known license in the header of every file and a full copy of the license available in the root folder of your project, named LICENSE or LICENSE.TXT. This, of course, raises the question of which license to use—and why?
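As a concrete illustration of that convention (the SPDX-style header and the five-line scan window below are our own suggestion, not a requirement of any particular license):

```python
# A typical per-file header names a well-known license unambiguously.
# SPDX identifiers are a widely used convention for doing exactly that:
HEADER_EXAMPLE = """\
# SPDX-License-Identifier: MIT
# Copyright (c) 2020 Jane Developer
"""

def has_license_header(source_text, marker="SPDX-License-Identifier:"):
    """Check whether the first few lines of a source file declare a license."""
    head = source_text.splitlines()[:5]  # only scan the top of the file
    return any(marker in line for line in head)

print(has_license_header(HEADER_EXAMPLE))         # True
print(has_license_header("print('no license')"))  # False
```

A check like this is easy to wire into CI so that new files can't land without a declared license.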

There are a few general types of licenses available, and we'll cover each in its own section, along with one or more prominent examples of this license type.

Default licensing—proprietary, all rights reserved

In most jurisdictions, any code or content is automatically copyrighted by the author, with all rights reserved, unless otherwise stated. While it's good form to declare the author and the copyright date in the header of any code or document, failing to do so doesn't mean the author's rights are void.

An author who makes content or code available on their own website, a GitHub repository, etc.—either without a stated license or with an express declaration of copyright—maintains both usage and distribution rights for that code, even though it's trivially simple to view or download. If you execute that code on your own computer or computers, you're transgressing on the author's usage rights, and they may bring a civil suit against you for violating their copyright, since they never granted you that right.

Similarly, if you copy that code and give it to a friend, post it on another website, sell it, or otherwise make it available anywhere beyond where the author originally posted it, you've transgressed upon the author's distribution rights, and they have standing to bring a civil suit against you.

Note that an author who maintains proprietary rights to a codebase may individually grant license to persons or organizations to use that code. Technically, you don't ever "buy" software, even when it's boxed up in a physical store. What you're actually purchasing is a license to use the software—which may or may not include physical media containing a copy of the code.

Home-grown licenses

The short version of our comment on home-grown licensing is simple: just don't do it.

There are enough well-understood, OSI-approved open source licenses in the world already that nearly any person or project should be able to find an appropriate one. Writing your own license instead means that potential users of your project, content, or code will have to do the same thing the author didn't want to—read and understand a new license from scratch.


Digital Politics is a column about the global intersection of technology and the world of politics.

As Europe lays out its grand vision for a digital future, there is at least one area where the bloc remains unrivaled — creating obstacles for itself.

When the European Commission unveiled its proposals for competing with the United States and China on everything from artificial intelligence to the data economy, President Ursula von der Leyen made it clear the 27-country bloc would do things its own way.

“We want to find European solutions for the digital age,” she told reporters in Brussels.

Those solutions include creating vast pools of data from industry to spur innovation in machine learning and other next-generation technologies — all while upholding the bloc's fundamental rights like data privacy, which set the European Union apart from the world's other powers.

Such balanced rhetoric is appealing. Amid a global techlash, who wouldn't want greater control over digital services?

The bloc is already working from a weak position compared with its global competitors in terms of money invested into technology and expertise to turn capital into global champions.

The problem is the EU's promise to embed its values into all aspects of technology is likely to hamstring its efforts to compete in the big leagues of tech against the U.S. and China.

And there's one reason why: Rivals won't follow its lead.

Where Europe wants to put limits on how facial recognition can be used, China is quickly rolling out a countrywide network of smart cameras equipped with machine-learning algorithms that make George Orwell's “1984” look like a kid's nursery rhyme.


For sale: Exploding power banks, toxic toys and faulty smoke alarms.

These products — all banned in the European Union — can be purchased with a click of a button from large online marketplaces such as Amazon, AliExpress, eBay and Wish, according to a study carried out by consumer organizations in Belgium, Italy, the Netherlands, Denmark, Germany and the U.K.

Of the 250 products bought from popular online shops, including electronics, toys and cosmetics, two-thirds failed the EU's product-safety rules.

Some products, such as toys with loose parts or children's hoodies with long cords that pose a choking hazard, were visibly unsafe. Others contained dangerous chemicals at up to 200 times the permitted levels, while smoke and carbon monoxide alarms failed to detect deadly amounts of gas or smoke.

Three out of four tested USB chargers, travel adaptors and power banks failed electrical safety tests. Most of them were cheap, unbranded products.

Under current laws, platforms are required to remove listings of unsafe items “expeditiously” once they are flagged. They are under no obligation to proactively search for faulty or dangerous products. Consumer groups have for years called for platforms to take more responsibility for the products sold on their marketplaces through third-party sellers.

British consumer group Which? first found unsafe child car safety seats sold on Amazon in 2014. Amazon promptly took down the product listings. Then the car seats cropped up again in spot checks in 2017 and 2019. In February, a BBC investigation found similar car seats on the platform yet again.

Regulators are finally starting to listen.

The current European consumer agenda, which dates from 2012, is woefully outdated when it comes to online shopping. European Commissioner for Justice and Consumers Didier Reynders has announced he will update the EU's consumer agenda later this year.

In 2018, U.S. companies Amazon and eBay, China's Alibaba and Japan's Rakuten signed a voluntary pledge with the European Commission to guarantee the safety of non-food consumer products offered by third-party sellers. Polish online marketplace Allegro and France's Cdiscount joined the group in 2020.

Wok tossing has long been suspected of causing the high shoulder-injury rate among Chinese chefs. (Hungtang Ko and David Hu/Georgia Tech)

Fried rice is a classic dish in pretty much every Chinese restaurant, and the strenuous process of tossing the rice in a wok over high heat is key to producing the perfect final product. There's always chemistry involved in cooking, but there's also a fair amount of physics. Scientists at the Georgia Institute of Technology have devised a model for the kinematics of wok-tossing to explain how it produces fried rice that is nicely browned, but not burnt. They described their work in a recent paper published in the Journal of the Royal Society Interface.

This work hails from David Hu's lab at Georgia Tech, known for investigating such diverse phenomena as the collective behavior of fire ants, water striders, snakes, various climbing insects, mosquitos, the unique properties of cat tongues, and animal bodily functions like urination and defecation—including a 2019 Ig Nobel Prize winning study on why wombats produce cubed poo. Hu and his graduate student, Hungtang Ko—also a co-author on a 2019 paper on the physics of how fire ants band together to build rafts—discovered they shared a common interest in the physics of cooking, particularly Chinese stir-fry.

Hu and Ko chose to focus their investigation on fried rice (or "scattered golden rice"), a classic dish dating back some 1500 years. According to the authors, tossing the ingredients in the wok while stir-frying ensures that the dish is browned, but not burned. Something about this cooking process creates the so-called "Maillard reaction": the chemical interaction of amino acids and carbohydrates subjected to high heat that is responsible for the browning of meats, for instance.

But woks are heavy, and the constant tossing can take its toll on Chinese chefs, some 64 percent of whom report chronic shoulder pain, among other ailments. Hu and Ko thought that a better understanding of the underlying kinematics of the process might one day lead to fewer wok-related injuries for chefs.

In the summers of 2018 and 2019, Ko and Hu filmed five chefs from stir-fry restaurants in Taiwan and China, cooking fried rice, and then extracted frequency data from that footage. (They had to explain to patrons that the recording was for science, and that they were not making a television show.) It typically takes about two minutes to prepare the dish, including sporadic wok-tossing—some 276 tossing cycles in all, each lasting about one-third of a second.

  • Image sequence showing the wok-tossing process, modeled as a two-link pendulum. Hungtang Ko and David Hu/Georgia Tech
  • Schematic diagram showing mathematical model for wok-tossing. Hungtang Ko and David Hu/Georgia Tech
  • Graph showing the metrics for evaluating wok tosses. Hungtang Ko and David Hu/Georgia Tech

Ko and Hu presented preliminary results of their experiments at a 2018 meeting of the American Physical Society's Division of Fluid Dynamics, publishing the complete analysis in this latest paper. They were able to model the wok's motion with just two variables, akin to a two-link pendulum, since chefs typically don't lift the wok off the stove, maintaining "a single sliding point of contact," they wrote. Their model predicted the trajectory of the rice based on projectile motion, using three metrics: the proportion of the rice being tossed, how high it was tossed, and its angular displacement.
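The projectile part of that model is ordinary ballistics. As a rough illustration (this is our own sketch, not the paper's code; the airborne time is an assumption based on the roughly one-third-second toss cycles mentioned above), rice launched and caught within one cycle can only rise a few centimeters:

```python
G = 9.81  # gravitational acceleration, m/s^2

def toss_height_m(airborne_s):
    """Peak height of rice that leaves the wok and lands airborne_s later.

    Rising for half the flight and falling for half: h = g * (t/2)^2 / 2.
    """
    return G * airborne_s ** 2 / 8

# Assume the rice is airborne for a full one-third-second cycle (an upper
# bound, since part of each cycle is spent in contact with the wok):
print(round(toss_height_m(1 / 3), 3))  # 0.136 -> at most ~14cm of rise
```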

The authors found two distinct stages of wok-tossing: pushing the wok forward and rotating it clockwise to catch rice as it falls, and pulling the wok back.


The top floor of our test house is relatively straightforward—although, like many houses, it suffers from terrible router placement nowhere near its center. (Jim Salter)

Here at Ars, we've spent a lot of time covering how Wi-Fi works, which kits perform the best, and how upcoming standards will affect you. Today, we're going to go a little more basic: we're going to teach you how to figure out how many Wi-Fi access points (APs) you need, and where to put them.

These rules apply whether we're talking about a single Wi-Fi router, a mesh kit like Eero, Plume, or Orbi, or a set of wire-backhauled access points like Ubiquiti's UAP-AC line or TP-Link's EAPs. Unfortunately, these "rules" are necessarily closer to "guidelines" as there are a lot of variables it's impossible to fully account for from an armchair a few thousand miles away. But if you become familiar with these rules, you should at least walk away with a better practical understanding of what to expect—and not expect—from your Wi-Fi gear and how to get the most out of it.

Before we get started

Let's go over one bit of RF (radio-frequency) theory before we get started on our ten rules—some of them will make much better sense if you understand how RF signal strength is measured and how it attenuates over distance and through obstacles.

Note: some RF engineers recommend -65dBm as the lowest signal level for maximum performance. (Jim Salter)

The above graph gives us some simple free-space loss curves for Wi-Fi frequencies. The most important thing to understand here is what the units actually mean: dBm expresses power relative to one milliwatt (0dBm = 1mW) on a base-ten logarithmic scale. Every 10dB drop cuts the actual signal strength in milliwatts by a factor of ten: -10dBm is 0.1mW, -20dBm is 0.01mW, and so forth.

The logarithmic scale makes it possible to measure signal loss additively, rather than multiplicatively. Each doubling of distance drops the signal by 6dB, as we can clearly see when we look at the bold red 2.4GHz curve: at 1m distance, the signal is -40dBm; at 2m, it's -46dBm; and at 4m, it's down to -52dBm.
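Both relationships are easy to check numerically. A quick sketch of the dBm-to-milliwatt conversion:

```python
import math

def dbm_to_mw(dbm):
    """Convert a dBm reading to milliwatts: 0dBm = 1mW, -10dBm = 0.1mW."""
    return 10 ** (dbm / 10)

def mw_to_dbm(mw):
    """Convert milliwatts back to dBm."""
    return 10 * math.log10(mw)

print(dbm_to_mw(-10))  # 0.1
print(dbm_to_mw(-20))  # 0.01
# Doubling the distance costs about 6dB, i.e. roughly a quarter of the power:
print(dbm_to_mw(-46) / dbm_to_mw(-40))  # ~0.25
```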

Walls and other obstructions—including but not limited to human bodies, cabinets and furniture, and appliances—will attenuate the signal further. A good rule of thumb is an additional -3dB for each wall or other significant obstruction, which we'll talk more about later. You can see additional curves plotted above in finer lines for the same distances including one or two additional walls (or other obstacles).

While you should ideally have signal levels no lower than -67dBm, you shouldn't fret about trying to get them much higher than that—typically, there's no real performance difference between a blazing-hot -40dBm and a considerably cooler -65dBm, as far apart as they may seem on a chart. There's a lot more going on with Wi-Fi than just raw signal strength; as long as you exceed that minimum, it doesn't really matter by how much.

In fact, too hot of a signal can be as much of a problem as too cold—many a forum user has complained for pages about low speed test results, until finally some wise head asks "did you put your device right next to the access point? Move it a meter or two away, and try again." Sure enough, the "problem" resolves itself.

Rule 1: No more than two rooms and two walls

Our first rule for access point placement is no more than two rooms and two interior walls between access points and devices, if possible. This is a pretty fudge-y rule, because different rooms are shaped and sized differently, and different houses have different wall structures—but it's a good starting point, and it will serve you well in typically-sized houses and apartments with standard, reasonably modern sheet rock interior wall construction.

"Typically-sized," at least in most of the USA, means bedrooms about three or four meters per side and larger living areas up to five or six meters per side. If we take nine meters as the average linear distance covering "two rooms" in a straight line, and add in two interior walls at -3dBM apiece, our RF loss curve shows us that 2.4GHz signals are doing fantastic at -65dBM. 5GHz, not so much—if we need a full nine meters and two full walls, we're down to -72dBM at 5GHz. This is certainly enough to get a connection, but it's not great. In real life, a device at -72dBM on 5GHz will likely see around the same raw throughput as one at -65dBM on 2.4GHz—but the technically slower 2.4GHz connection will tend to be more reliable and exhibit consistently lower latency.

The shores of Lake Mead, faded from previously higher water levels. (Chris Richards / Flickr)

In 2014, the Colorado River reached the ocean for the first time in 16 years. Most years, the river doesn't make it that far because it has been dammed and diverted along the way, supplying fresh water to approximately 40 million people and supporting agriculture and economic activity in the dry Southwestern United States.

As climate change disrupts historical patterns of rainfall and temperature, the Colorado River has not been faring well, and it's increasingly unlikely that the river will reach the sea again. A paper published this week in Science reports that the river's flow declines by an alarming 9.3 percent for every 1°C of warming—and that declining snow levels are the main culprit for this dramatic decline.

Some history

For a resource as critical and carefully managed as the Colorado River, precision is key. Just knowing that it's declining in response to climate change is not enough; more crucial is knowing how much that decline is likely to be.

But figuring out how much a river's flow is likely to decline is not a simple task. Climate change alters all sorts of variables, from the actual air temperature to how much precipitation falls and whether it falls as snow or rain. Because these factors all feed into each other, researchers hit on different estimates for how rivers around the world will change in the face of warming. These discrepancies, write US Geological Survey researchers P. Chris Milly and Krista Dunne, lead to a huge amount of uncertainty for our understanding of how water shortages will affect "human livelihood, economic activity, and ecosystem health."

To get a better handle on how warming will affect the Colorado River, Milly and Dunne first looked backwards. They used historical data going back to 1912 to build a computer simulation of the river, dividing its vast length into hundreds of sub-areas with unique features, like different topography and rainfall.

The simulation allowed them to work out how different climate factors affected the river's flow. One factor in particular played an important role: reduced snow cover, which leads to more evaporation. Less snow means more dark ground is exposed and absorbing heat, instead of being covered in white stuff that reflects light. The warmer ground means higher rates of evaporation and, thus, less water in the river.

Snow cover is a "protective shield," write Milly and Dunne. And that shield is slowly being lost.

A dry future

What does this mean for the future of the river and the people who depend on it? To figure that out, Milly and Dunne looked at a range of climate models that predict how global temperatures will change in future, using scenarios that depend on how well we do at curbing emissions.

Under a "business as usual" scenario, their model predicted tRead More – Source
