Solar Team Eindhoven presented Stella Lux, an intelligent, solar-powered family car that generates more power than it uses. Read all about it, plus photos and specs.
Generating more power than it uses is a major advantage for an electric car: the batteries charge while driving, so as long as there is enough sunlight, driving draws no net battery charge at all. The battery is only tapped at other times, such as during heavily overcast spells or at night. It also means the panels are efficient enough that even in weaker sunlight they still generate substantial power.
Extremely useful.
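The energy budget behind this is simple arithmetic: whenever solar input exceeds the drive load, the surplus flows into the battery. A toy sketch of that balance, with all figures being illustrative assumptions rather than Stella Lux's real specifications:

```python
def battery_after_trip(battery_wh, solar_in_w, drive_draw_w, hours):
    """Return battery charge (Wh) after a trip, clamped to pack capacity.

    All parameters are hypothetical round numbers for illustration.
    """
    CAPACITY_WH = 15000  # assumed pack size, not the real figure
    net_w = solar_in_w - drive_draw_w  # positive = charging while driving
    return max(0, min(CAPACITY_WH, battery_wh + net_w * hours))

# Sunny: assumed 1500 W from the panels outproduces an assumed 1200 W
# drive load, so the battery gains charge even while the car is moving.
sunny = battery_after_trip(8000, 1500, 1200, 2)   # 8000 + 300*2 = 8600 Wh

# Overcast: panel output drops to 400 W and the battery covers the shortfall.
cloudy = battery_after_trip(8000, 400, 1200, 2)   # 8000 - 800*2 = 6400 Wh

print(sunny, cloudy)
```

The same arithmetic shows why "more power than it uses" matters: with a positive net figure in sunlight, the battery is effectively reserved for clouds and night.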
Out of everyone who ever existed, who has done the most good for humanity? It’s a difficult question.
[…]
In 1958, Viktor Zhdanov was a deputy minister of health of the Soviet Union. In May of that year, the Soviet Union made its first appearance at the World Health Assembly after a nine-year absence, at the Eleventh Assembly in Minneapolis, Minnesota. There Zhdanov presented a lengthy report with a visionary plan to eradicate smallpox. At the time, no disease had ever been eradicated, and no one knew whether it could even be done. Nor did anyone expect such a suggestion to come from the Soviet Union; in fact, Zhdanov had had to fight internal pressure within the USSR to win support for his plan. When he addressed the assembly of the WHO, he conveyed his message with passion, conviction, and optimism, boldly suggesting that the disease could be eradicated within ten years.
The Brisbane astrobiologist at the forefront of NASA’s next mission to Mars has one regret in her stellar career – that she could not lead the charge to discover evidence of extraterrestrial life from her own country.
Abigail Allwood, the co-leader of the coming Mars 2020 rover mission, said Australia would continue to lose its best and brightest minds if it did not embrace one of the most awe-inspiring of scientific fields.
“It’s a little bit sad, for me, to see that when I finished my degree here in Australia, I couldn’t pursue the kind of things I wanted to do in Australia at all,” she said.
“There’s very little involvement in space exploration.
“We don’t have a formal space agency, which makes it very difficult for us to participate in opportunities like this and, to me, it belies our capability.
“We produce so many bright graduates. We have a fantastic education system producing engineering, science technology and mathematics graduates and the sorts of things that really inspire them, like space exploration, is not possible to do here in Australia.”
Dr Allwood, who was at the Queensland University of Technology on Thursday to accept an outstanding alumnus award from the science and engineering faculty, said Australia had the capability to be a leader in space exploration.
But the nation’s involvement in humanity’s great exploratory frontier was “less than it could be”.
“There are some incredible Australian scientists overseas who want to come back and work here, if they had the similar opportunities back here that they do overseas,” Dr Allwood said.
We need to ban offensive autonomous weapons – or ‘killer robots’ – before a new arms race to produce them begins.
More than 1,000 of the leading researchers in artificial intelligence (AI) and robotics have today signed and published an open letter calling for a ban on offensive autonomous weapons, also known colloquially as “killer robots”.
The letter has also been signed by many technologists and experts, including SpaceX and Tesla CEO Elon Musk, physicist Stephen Hawking, Apple co-founder Steve Wozniak, Skype co-founder Jaan Tallinn and linguist and activist Noam Chomsky.
[…]
A press conference releasing the open letter to the public will be held at the opening of the International Joint Conference on AI at 9pm AEST, July 28, 2015. To watch the streaming of the press conference on Periscope (live or for the next 24 hours), follow @TobyWalsh on Twitter for notification of the stream.
The following is the entire text of the open letter:
Autonomous weapons select and engage targets without human intervention. They might include, for example, armed quadcopters that can search for and eliminate people meeting certain pre-defined criteria, but do not include cruise missiles or remotely piloted drones for which humans make all targeting decisions. Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is – practically if not legally – feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms.
Many arguments have been made for and against autonomous weapons, for example that replacing human soldiers by machines is good by reducing casualties for the owner but bad by thereby lowering the threshold for going to battle. The key question for humanity today is whether to start a global AI arms race or to prevent it from starting. If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow. Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce. It will only be a matter of time until they appear on the black market and in the hands of terrorists, dictators wishing to better control their populace, warlords wishing to perpetrate ethnic cleansing, etc. Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group. We therefore believe that a military AI arms race would not be beneficial for humanity. There are many ways in which AI can make battlefields safer for humans, especially civilians, without creating new tools for killing people.
Just as most chemists and biologists have no interest in building chemical or biological weapons, most AI researchers have no interest in building AI weapons — and do not want others to tarnish their field by doing so, potentially creating a major public backlash against AI that curtails its future societal benefits. Indeed, chemists and biologists have broadly supported international agreements that have successfully prohibited chemical and biological weapons, just as most physicists supported the treaties banning space-based nuclear weapons and blinding laser weapons.
In summary, we believe that AI has great potential to benefit humanity in many ways, and that the goal of the field should be to do so. Starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control.
In this Objectivity video, Brady Haran visits Sir Paul Nurse, President of the Royal Society, to discuss the 350th anniversary of the Society's Philosophical Transactions, the world's first scientific journal; its first edition dates from 1665.
The video shows people's curiosity: sharing information, having it reviewed both before and after publication, building on each other's prior work, and so on. Different times, but good science.
The first species of yeti crab from the hydrothermal vent systems of the East Scotia Ridge in the Southern Ocean, Antarctica, has been described. The yeti crab is notable for its body, which is densely covered in bristles (known as setae) and bacteria, giving it a fur-like appearance.