WebProNews

Tag: Computers

  • Intel Is Getting Out Of The Motherboard Business

    According to multiple reports, Intel is exiting its desktop motherboard business after about two decades of making boards. PCWorld, which spoke with the company, reports that Intel will pull out as soon as its next-generation Haswell CPU ships, and will dissolve the business over the course of the next three years.

    Forbes, which received an email from Intel, reports:

    It’s an announcement with layers of ramifications, but it’s certainly not a death knell. The tried-and-true tower PC is a fixture among business and home users alike. While companies like Dell and HP have traditionally relied on Intel for motherboards in their low-to-mid-range desktops, there are several other companies happy to carry the torch. Just ask any power user or gaming enthusiast whether they think ASUS or MSI will be elated to hear this news.

    This is a forward-thinking move for Intel, which also said “The internal talent and experience of twenty years in the boards business…is being redistributed to address emerging new form factors.” Those form factors are springing up from all corners, inspired by Windows 8 and by consumers shifting their needs to devices that serve multiple purposes in multiple environments. Microsoft’s Surface Pro and the TAICHI from ASUS are perfect examples of where the post-PC market is heading.

    In other news, Intel has declared a 22.5-cents-per-share (90 cents per share on an annual basis) quarterly dividend on the company’s common stock. It will be payable on March 1 to stockholders of record on February 7.

    CEO Paul Otellini said, “With the payout of this quarterly dividend, Intel’s dividend and stock buyback program will have returned approximately $119 billion to stockholders since the program’s inception. This is a testament to our commitment to return cash to our stockholders as we continue to generate strong cash flow driven by the combination of new products and design wins from the lowest power portable devices to the most powerful data center servers.”

  • IBM’s Watson Cursed Like a Sailor After Being Taught the Urban Dictionary

    It’s a fact that Watson, IBM’s massive AI project, is smarter than the average human. I mean, it kicked Ken Jennings’ ass on Jeopardy that one time. “Smart,” in that respect, meant the ability to pull knowledge from terabytes’ worth of Wikipedia data based on verbal clues.

    But Ken Jennings (and you and I) still have Watson beat in one measure of intelligence: human language. Once that fact no longer holds true, well, we’re all in a hell of a lot of trouble.

    Nevertheless, IBM is trying to improve Watson’s human language prowess. And to do that, Watson needs to understand how humans talk – how they really talk. I’m talking about slang, of course. People simply don’t realize just how complicated human language really is. Teaching Watson proper and direct English is nowhere near good enough to turn it into a fully functional conversation partner. I mean, how the hell is it going to know how to respond to YOLO?

    So, to work on that slang element of human language, IBM researchers decided to teach Watson the Urban Dictionary – you know, the online database of anything and everything human beings say – from the inane to the foul.

    Apparently, this led to a problem. Watson developed a mouth like a sailor. From Fortune:

    Watson couldn’t distinguish between polite language and profanity — which the Urban Dictionary is full of. Watson picked up some bad habits from reading Wikipedia as well. In tests it even used the word “bullshit” in an answer to a researcher’s query.

    Ultimately, Brown’s 35-person team developed a filter to keep Watson from swearing and scraped the Urban Dictionary from its memory. But the trial proves just how thorny it will be to get artificial intelligence to communicate naturally. Brown is now training Watson as a diagnostic tool for hospitals. No knowledge of OMG required.
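
    Fortune doesn’t describe how the filter actually worked, but the basic idea is easy to sketch. Below is a minimal, hypothetical Python version – a blocklist check that screens candidate answers before they’re spoken, with a little normalization so “Bullshit!” still matches “bullshit”. To be clear, none of this is IBM’s code; every name and word list here is invented for illustration.

        import re

        # Hypothetical output filter; IBM's actual implementation is not public.
        BLOCKLIST = {"bullshit", "damn"}  # extend as needed

        def normalize(token: str) -> str:
            """Lowercase and strip punctuation so 'Bullshit!' matches 'bullshit'."""
            return re.sub(r"[^a-z]", "", token.lower())

        def is_speakable(answer: str) -> bool:
            """Reject any candidate answer containing a blocklisted word."""
            return not any(normalize(t) in BLOCKLIST for t in answer.split())

        candidates = ["That's bullshit.", "That claim appears to be unfounded."]
        print(next(a for a in candidates if is_speakable(a)))
        # -> "That claim appears to be unfounded."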

    Suck it, Trebek.

    [Fortune via The Atlantic]

  • New ThinkPad X1 Carbon From Lenovo To Be Available This Month

    Lenovo introduced the ThinkPad X1 Carbon back in May, and the device will reportedly ship by the end of the month. No concrete release date has been announced, but it should be available soon enough, through Lenovo’s partners and Lenovo.com. On the product page, it still just says “available summer 2012”.

    A commenter at SlashGear indicates that a Lenovo sales rep told him the date would be August 21st.

    The ThinkPad X1 Carbon is the lightest 14″ Ultrabook in the world, according to Lenovo.

    “Created for those who demand the highest levels of performance, mobility, entertainment and design, the ThinkPad X1 Carbon exceeds Ultrabook specifications by using a premium carbon fiber rollcage to create a durable Ultrabook weighing less than three pounds,” the company says.

    “Just like in the automotive industry, carbon fiber materials add structural strength and durability while minimizing weight,” the company adds on the product page.

    It comes with embedded 3G connectivity and a mobile pay-as-you-go option, and contains Intel vPro technology.

    Lenovo also touts the fact that you can charge the device’s battery to 80% in as little as 35 minutes with its RapidCharge feature.

    When the company introduced the X1 Carbon, it also revealed a new portfolio of ThinkPad laptops, the new ThinkPad X, T, W, and L Series models, which come with 3rd generation Intel Core processors and Lenovo Enhanced Experience 3.0. The laptops are 40% faster than typical Windows 7 computers, according to the company. These came out in June.

  • Computer Screens Are Killing Our Eyes Because We Work Too Hard

    I think we all know the negative consequences of staring at a screen for too long: it can lead to premature blindness and other eye problems. The problem is that almost every job today requires a significant amount of time in front of a computer monitor. That screen time is killing our eyes, according to a report from the American Academy of Optometry.

    So what should we do about it? Health experts say to reduce our screen time to less than two hours a day, with the caveat that screen time required for work is an exception. They also say that you really should get out more and rest your eyes on books and other things.

    Funnily enough, the biggest problem affecting our eyes is being productive. People who focus entirely on their work forget some basic eye care like blinking. The average person blinks about 12 to 15 times a minute; when focused on work, a person will blink only four to five times a minute. Since blinking is such a subconscious action, it’s hard to actually force yourself to blink without taking your mind off the work at hand.

    For better eye health, be sure to sit a good distance from the screen in a well-lit room. Sitting too close and working in the dark can both contribute to poor eye health. It’s hard to rearrange your workstation while at the office, but you can at least distance yourself from the screen. Be sure to keep the lights in the room on as well. The backlight of a computer screen coupled with a dark room is a recipe for early onset blindness.

    I should admit that I don’t wear glasses and I’ve been staring at screens for the better part of 20 years now. My eyes don’t even appear to be getting worse with age, but I know other people who have problems. Ignoring the fact that I’m a mutant, taking care of your eyes in today’s digital age is super important. Until doctors are able to stab us in the eyes with needles to cure various vision problems, it’s best to take care of them.

    [h/t: A Healthier Michigan]

  • If You Spend a Lot of Time Online, You Probably Have a Mental Disorder

    Never mind ruining your eyesight, alienating your friends, or developing a serious Vitamin D deficiency — apparently spending too much time in front of a computer screen could lead to serious mental health disorders. According to a recent study perpetrated by a group of researchers at the University of Gothenburg in Sweden, devoting a large section of your life to online shenanigans could result in stress, sleep deprivation, and depression.

    Of course, it’s not so much the act of sitting in front of the computer that causes these problems as it is the time you waste doing so. Sara Thomee, lead author of this revealing study, explains that, when you’re devoting that much of your life towards one particular thing, you essentially allow other things to fall by the wayside, thus creating the unwanted stress.

    “High quantitative use was a central link between computer use and stress, sleep disturbances, and depression, described by the young adults,” Thomee explained in the study. “It was easy to spend more time than planned at the computer (e.g., working, gaming, or chatting), and this tended to lead to time pressure, neglect of other activities and personal needs (such as social interaction, sleep, physical activity), as well as bad ergonomics, and mental overload.”

    Spend a lot of time on the phone? You’re at risk, as well. Being available 24/7, for whatever reason, can often lead to feelings of being trapped, of “never being free” from those who wish to monopolize all of your time. And when you skip out on returning calls or checking your voicemail right away, feelings of guilt can set in.

    Last, but certainly not least, are the video gamers. They, too, are at risk for depression and sleep reduction, not to mention a reduced and underperforming libido. “Daily computer gaming for 1–2 hours meant an increased risk for symptoms of depression in the women. Often using the computer late at night (and consequently losing sleep) was a prospective risk factor for stress and sleep disturbances, including reduced performance, in both sexes,” the study revealed.

    Once again, folks, we’re at that point where we simply cannot moderate our activities. Be it eating or computing or gaming or chatting — we’re a gluttonous lot, and our bodies and minds ultimately pay the price for our behavior. So the next time you find yourself wandering around Skyrim for hours at a time, perhaps you should take a break and do something else for a little while. Give your brain a break, treat your legs to some exercise.

    As David Byrne used to say, “It might do you some good”.

  • World’s Most Popular Screen Resolution Now At 1366×768

    There’s something to be said for having a great screen resolution. I use a 16:10 monitor at 1680×1050 with my PC. It’s not exactly the enthusiast golden standard of 1920×1080, but it serves me well and looks great to boot. That’s why it’s all the more distressing that most people in the world don’t care about resolution as much as I do.

    For the longest time, people have been content with a paltry 1024×768 resolution. Now, for the first time in three years, 1024×768 has finally fallen behind a competing resolution. The new most popular resolution worldwide is a better, but still really small, 1366×768. This information comes from StatCounter, which has been keeping track of global screen resolution trends since 2009. Surely this isn’t that important though, right? Wrong!

    “The data reflects a continuing trend of users moving to larger screen resolution sizes,” commented Aodhan Cullen, CEO, StatCounter. “The screen resolution size people are using is a critical factor for developers when it comes to web design, particularly in the case of fixed width web pages.”

    How do the actual numbers work out though? 1024×768 has decreased in global use to only 18.6 percent, while 1366×768 has risen above it to 19.28 percent. It’s not a big difference, but it’s significant for the Web development ecosystem.

    Source: StatCounter Global Stats – Screen Resolution Market Share

    On a similar note, the move from 1024×768 to 1366×768 shows a change in aspect ratio as well. The new standard is a 16:9 resolution, which means that more and more people are switching over to widescreen. Hmm… I wonder, what popular social media service drew the ire of users when it switched its design to a non-widescreen-friendly version full of whitespace? Companies like Google need this kind of information the most so they can take advantage of that whitespace for the growing number of people using widescreen monitors.

    I was curious to see if this trend was being influenced by the types of PCs being sold. I highly doubt that many desktop monitors are being sold at that size, so the main culprit must be the all-powerful laptop. Tablets don’t really come into question, since the iPad 2 runs at 1024×768 and the new iPad keeps the same 4:3 aspect ratio.

    Looking at Newegg, the selection of laptops with a 1024×768 resolution has all but dried up. While there are other resolutions to choose from, the most bountiful selection is, you guessed it, 1366×768, with 378 laptops to choose from. It seems that hardware manufacturers have been pushing this as the new standard in entry-level laptops, and it’s working.

    To recap, Web developers need to start building their applications with these resolutions in mind. While most will build their Web applications to accommodate a large variety of resolutions, it’s good to know where the standard sits.

    Are you still a 1024×768 diehard? Have you upgraded to 1366×768? Or are you like me and have an HD resolution? Let us know in the comments.

    [Lead Image: StatCounter]

  • With Internal Routing, Chips Could Function as Mini “Internets”

    Computer chips have stopped getting faster. In order to keep increasing chips’ computational power at the rate to which we’ve grown accustomed, chipmakers are instead giving them additional “cores,” or processing units. Today, a typical chip might have six or eight cores, all communicating with each other over a single bundle of wires, called a bus. With a bus, however, only one pair of cores can talk at a time, which would be a serious limitation in chips with hundreds or even thousands of cores, which many electrical engineers envision as the future of computing.

    Li-Shiuan Peh, an associate professor of electrical engineering and computer science at MIT, wants cores to communicate the same way computers hooked to the Internet do: by bundling the information they transmit into “packets.” Each core would have its own router, which could send a packet down any of several paths, depending on the condition of the network as a whole.

    In principle, multicore chips are faster than single-core chips because they can split up computational tasks and run them on several cores at once. Cores working on the same task will occasionally need to share data, but until recently, the core count on commercial chips has been low enough that a single bus has been able to handle the extra communication load. That’s already changing, however: “Buses have hit a limit,” Peh says. “They typically scale to about eight cores.” The 10-core chips found in high-end servers frequently add a second bus, but that approach won’t work for chips with hundreds of cores.

    For one thing, Peh says, “buses take up a lot of power, because they are trying to drive long wires to eight or 10 cores at the same time.” In the type of network Peh is proposing, on the other hand, each core communicates only with the four cores nearest it. “Here, you’re driving short segments of wires, so that allows you to go lower in voltage,” she explains.
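
    To make the contrast concrete, here is a toy Python model – my own illustration under simplified assumptions, not the MIT group’s design – comparing a shared bus, where transfers must take turns, with a 2D mesh using simple XY routing, where transfers that touch different links can proceed at the same time.

        # Toy model of bus vs. 2D mesh on-chip interconnect (illustrative only).

        def mesh_hops(src, dst):
            """Hops under XY routing: move along x first, then along y."""
            (sx, sy), (dx, dy) = src, dst
            return abs(dx - sx) + abs(dy - sy)

        # Four simultaneous core-to-core transfers on an 8x8 (64-core) grid.
        transfers = [((0, 0), (1, 0)), ((7, 7), (6, 7)),
                     ((0, 7), (0, 6)), ((7, 0), (7, 1))]

        # Bus: only one pair of cores can talk at a time, so transfers serialize.
        bus_time = len(transfers)

        # Mesh: these transfers use disjoint links, so they run in parallel.
        mesh_time = max(mesh_hops(s, d) for s, d in transfers)

        print(f"bus: {bus_time} slots, mesh: {mesh_time} slot")  # bus: 4, mesh: 1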

    In an on-chip network, however, a packet of data traveling from one core to another has to stop at every router in between. Moreover, if two packets arrive at a router at the same time, one of them has to be stored in memory while the router handles the other. Many engineers, Peh says, worry that these added requirements will introduce enough delays and computational complexity to offset the advantages of packet switching. “The biggest problem, I think, is that in industry right now, people don’t know how to build these networks, because it has been buses for decades,” Peh says.

    Peh and her colleagues have developed two techniques to address these concerns. One is something they call “virtual bypassing.” In the Internet, when a packet arrives at a router, the router inspects its addressing information before deciding which path to send it down. With virtual bypassing, however, each router sends an advance signal to the next, so that it can preset its switch, speeding the packet on with no additional computation. In her group’s test chips, Peh says, virtual bypassing allowed a very close approach to the maximum data-transmission rates predicted by theoretical analysis.

    The other technique is something called low-swing signaling. Digital data consists of ones and zeroes, which are transmitted over communications channels as high and low voltages. Sunghyun Park, a PhD student advised by both Peh and Anantha Chandrakasan, the Joseph F. and Nancy P. Keithley Professor of Electrical Engineering, developed a circuit that reduces the swing between the high and low voltages from one volt to 300 millivolts. With its combination of virtual bypassing and low-swing signaling, the researchers’ test chip consumed 38 percent less energy than previous packet-switched test chips. The researchers have more work to do, Peh says, before their test chip’s power consumption gets as close to the theoretical limit as its data transmission rate does. But, she adds, “if we compare it against a bus, we get orders-of-magnitude savings.”
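
    The voltage figure is easy to sanity-check. To first order, the dynamic energy of driving a wire is commonly approximated as E ≈ C·V², so cutting the swing from 1 volt to 300 millivolts should cut per-transition wire energy by roughly 90 percent; the chip’s overall 38 percent saving is smaller because wires are only one piece of the power budget. A quick back-of-the-envelope in Python, with an assumed wire capacitance:

        # First-order model: dynamic wire energy scales as C * V^2.
        C = 1e-12              # 1 pF of wire capacitance (assumed, illustrative)
        e_full = C * 1.0 ** 2  # full-swing signaling: 1 V
        e_low = C * 0.3 ** 2   # low-swing signaling: 300 mV
        print(f"low-swing energy is {e_low / e_full:.0%} of full-swing")  # 9%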

  • Jack Tramiel, Founder Of Commodore International, Dies At 83

    Computers, and by extension video games, are a relatively new technology. This is why most of the people who helped pave the way for these new-ish technologies are still alive. The sad truth, however, is that many of these pioneers are getting on in age. It’s with this in mind that we report the loss of a true legend and pioneer today.

    Jack Tramiel, as reported by Forbes, passed away Sunday at the age of 83. You may not know Tramiel by name, but you surely know his company and legacy. He was the founder of Commodore International, the company that created the famous Commodore 64. It’s because of this that his influence on computing and gaming cannot be overstated.

    Tramiel’s rise to fame is even more impressive when you learn that he was a Holocaust survivor. He was born to a Jewish family in Poland on December 13, 1928. In 1939, he was moved with his father to a labor camp, where his father died. He was rescued from the camp in 1945 and moved to the U.S. in 1947.

    It was in the U.S. that he entered the technology field. In 1953, he started his first company, Commodore Portable Typewriter, which repaired office machinery. After a less than successful venture into adding machines and calculators, Commodore finally entered the home computer market with the Commodore PET.

    While the PET was a success for the company, its first major breakaway success was the Commodore 64 in 1982. It went on to become the best-selling home computer of all time, with 17 million units sold. After this success, Tramiel resigned from Commodore to form Tramel Technology in 1984. It was there that he bought a dead-in-the-water Atari Inc. from Warner Communications after the video game market crash of 1983. Tramel Technology was renamed Atari Corporation, and the rest is history.

    All of this is to say that Jack Tramiel helped shape the future of computing and the industry it spawned. It was his mantra, “We need to build computers for the masses, not the classes,” that inspired the idea of the consumer computer. The Commodore 64 is a testament to that, and the continued consumer support for personal computers, whether desktop, laptop or tablet, carries on the idea of “computers for the masses.”

    The Twitter reaction has been heartfelt with fans of Commodore expressing their love of the company and the man who founded it:

    RIP Jack Tramiel. This LOAD”*”,8,1 is for you.

    RIP Jack Tramiel, the father of Commodore. Without him, no Speedball, no ProTracker mods, no SWOS, no Red Sector megademo. A sad day indeed.

    Jack Tramiel C= Founder & Atari Corp owner (1984-96) has died. His “computing for the masses, not the classes” phlosophy changed the industry

    Computer Legend and Gaming Industry Pioneer, Jack Tramiel Passes away at 83. He’s had quite the journey and should be an inspiration to all.

    RIP Jack Tramiel – you invented my first computer and changed my whole life by doing this. Thanks.

    RIP dear Jack Tramiel. Commodore-64, Commodore-128, and VIC-20 gave easy-to-use and frugal gateways to gaming and online groups.

    Very, very sad to hear about the death of Jack Tramiel. Founder of Commodore and then chief of Atari. An Auschwitz survivor, too. Legend.

    Jack Tramiel dies and gamers the world over say “Who?”. Utterly tragic. RIP, Jack. Games industry owes you a massive debt.

    I think that the computer industry is at the age where the founders are old and starting to die, makes me sad and proud. RIP Jack Tramiel.

    Goodbye Jack Tramiel, your VIC-20 and Commodore 64 shaped my future, helped me learn computers, and have a lot of fun playing games!

    We here at WebProNews salute Jack Tramiel. His contributions to the computer and gaming industries make our jobs more interesting and life more fun for everybody.

  • Finally, A Better Way To Cool Computers

    A North Carolina State University researcher has developed a more efficient, less expensive way of cooling electronic devices – particularly devices that generate a lot of heat, such as lasers and power devices.

    The technique uses a “heat spreader” made of a copper-graphene composite, which is attached to the electronic device using an indium-graphene interface film. “Both the copper-graphene and indium-graphene have higher thermal conductivity, allowing the device to cool efficiently,” says Dr. Jag Kasichainula, an associate professor of materials science and engineering at NC State and author of a paper on the research. Thermal conductivity is the rate at which a material conducts heat.

    In fact, Kasichainula found that the copper-graphene film’s thermal conductivity allows it to cool approximately 25 percent faster than pure copper, which is what most devices currently use.
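
    The standard way to quantify that comparison is Fourier’s law, which says the heat flow through a slab is Q = k·A·ΔT/d, with k the thermal conductivity. The sketch below is illustrative only: it assumes round numbers for the spreader geometry and copper’s roughly 400 W/m·K conductivity, and models the composite as 25 percent higher to mirror the reported speedup, since the paper’s exact figures aren’t given here.

        # Fourier's law: Q = k * A * dT / d (steady-state conduction).
        # All numbers below are illustrative assumptions.

        def heat_flow(k, area, delta_t, thickness):
            """Watts conducted through a slab with conductivity k (W/m*K)."""
            return k * area * delta_t / thickness

        A, dT, d = 1e-4, 40.0, 1e-3  # 1 cm^2 spreader, 40 K drop, 1 mm thickness
        copper = heat_flow(400.0, A, dT, d)
        composite = heat_flow(400.0 * 1.25, A, dT, d)  # assumed +25% conductivity

        print(f"copper: {copper:.0f} W, composite: {composite:.0f} W")  # 1600 vs. 2000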

    Dissipating heat from electronic devices is important, because the devices become unreliable when they become too hot.

    The paper also lays out the manufacturing process for creating the copper-graphene composite, using an electrochemical deposition process. “The copper-graphene composite is also low-cost and easy to produce,” Kasichainula says. “Copper is expensive, so replacing some of the copper with graphene actually lowers the overall cost.”

  • Quantum Computer Built Inside a Diamond

    Diamonds are forever – or, at least, the effects of this diamond on quantum computing may be. A team that includes scientists from USC has built a quantum computer in a diamond, the first of its kind to include protection against “decoherence” – noise that prevents the computer from functioning properly.

    The demonstration shows the viability of solid-state quantum computers, which – unlike earlier gas- and liquid-state systems – may represent the future of quantum computing because they can be easily scaled up in size. Current quantum computers are typically very small and – though impressive – cannot yet compete with the speed of larger, traditional computers.

    The multinational team included USC Professor Daniel Lidar and USC postdoctoral researcher Zhihui Wang, as well as researchers from the Delft University of Technology in the Netherlands, Iowa State University and the University of California, Santa Barbara. Their findings will be published on April 5 in Nature.

    The team’s diamond quantum computer system featured two quantum bits (called “qubits”), made of subatomic particles. As opposed to traditional computer bits, which can encode distinctly either a one or a zero, qubits can encode a one and a zero at the same time. This property, called superposition, along with the ability of quantum states to “tunnel” through energy barriers, will some day allow quantum computers to perform optimization calculations much faster than traditional computers.

    Like all diamonds, the diamond used by the researchers has impurities – things other than carbon. The more impurities in a diamond, the less attractive it is as a piece of jewelry, because it makes the crystal appear cloudy. The team, however, utilized the impurities themselves. A rogue nitrogen nucleus became the first qubit. In a second flaw sat an electron, which became the second qubit. (Though put more accurately, the “spin” of each of these subatomic particles was used as the qubit.) Electrons are smaller than nuclei and perform computations much more quickly, but also fall victim more quickly to “decoherence.” A qubit based on a nucleus, which is large, is much more stable but slower.

    “A nucleus has a long decoherence time – in the milliseconds. You can think of it as very sluggish,” said Lidar, who holds a joint appointment with the USC Viterbi School of Engineering and the USC Dornsife College of Letters, Arts and Sciences.

    Though solid-state computing systems have existed before, this was the first to incorporate decoherence protection – using microwave pulses to continually switch the direction of the electron spin rotation.

    “It’s a little like time travel,” Lidar said, because switching the direction of rotation time-reverses the inconsistencies in motion as the qubits move back to their original position.

    The team was able to demonstrate that their diamond-encased system does indeed operate in a quantum fashion by seeing how closely it matched “Grover’s algorithm.” The algorithm is not new – Lov Grover of Bell Labs invented it in 1996 – but it shows the promise of quantum computing.

    The test is a search of an unsorted database, akin to being told to search for a name in a phone book when you’ve only been given the phone number. Sometimes you’d miraculously find it on the first try, other times you might have to search through the entire book to find it. If you did the search countless times, on average, you’d find the name you were looking for after searching through half of the phone book. Mathematically, this can be expressed by saying you’d find the correct choice in X/2 tries – if X is the number of total choices you have to search through. So, with four choices total, you’ll find the correct one after two tries on average.

    A quantum computer, using the properties of superposition, can find the correct choice much more quickly. The mathematics behind it are complicated, but in practical terms, a quantum computer searching through an unsorted list of four choices will find the correct choice on the first try, every time. Though not perfect, the new computer picked the correct choice on the first try about 95 percent of the time – enough to demonstrate that it operates in a quantum fashion.
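
    The four-choice case is small enough to simulate directly. Here is a short NumPy sketch of one Grover iteration for N = 4 – my own illustration, not the USC team’s code: start in an equal superposition, flip the sign of the marked entry (the oracle), then reflect every amplitude about the mean (the diffusion step). In this ideal, noise-free simulation the marked item’s probability reaches exactly 1; the diamond device’s 95 percent is the real-world approach to that limit.

        import numpy as np

        # One Grover iteration for N = 4 (two qubits), ideal and noise-free.
        N = 4
        marked = 2                          # index of the "name" we want

        state = np.full(N, 1 / np.sqrt(N))  # equal superposition of all 4 choices

        state[marked] *= -1                 # oracle: flip the marked amplitude
        state = 2 * state.mean() - state    # diffusion: reflect about the mean

        print(state ** 2)                   # [0. 0. 1. 0.] -- found on the first try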

  • Scientists Use Babies To Make Computers Smarter

    People often wonder if computers make children smarter. Scientists at the University of California, Berkeley, are asking the reverse question: Can children make computers smarter? And the answer appears to be ‘yes.’

    UC Berkeley researchers are tapping the cognitive smarts of babies, toddlers and preschoolers to program computers to think more like humans.

    If replicated in machines, the computational models based on baby brainpower could give a major boost to artificial intelligence, which historically has had difficulty handling nuances and uncertainty, researchers said.

    “Children are the greatest learning machines in the universe. Imagine if computers could learn as much and as quickly as they do,” said Alison Gopnik, a developmental psychologist at UC Berkeley and author of “The Scientist in the Crib” and “The Philosophical Baby.”

    In a wide range of experiments involving lollipops, flashing and spinning toys, and music makers, among other props, UC Berkeley researchers are finding that children – at younger and younger ages – are testing hypotheses, detecting statistical patterns and drawing conclusions while constantly adapting to changes.

    “Young children are capable of solving problems that still pose a challenge for computers, such as learning languages and figuring out causal relationships,” said Tom Griffiths, director of UC Berkeley’s Computational Cognitive Science Lab. “We are hoping to make computers smarter by making them a little more like children.”

    For example, researchers said, computers programmed with kids’ cognitive smarts could interact more intelligently and responsively with humans in applications such as computer tutoring programs and phone-answering robots.

    And that’s not all.

    “Your computer could be able to discover causal relationships, ranging from simple cases such as recognizing that you work more slowly when you haven’t had coffee, to complex ones such as identifying which genes cause greater susceptibility to diseases,” said Griffiths. He is applying a statistical method known as Bayesian probability theory to translate the calculations that children make during learning tasks into computational models.

    This spring, to consolidate their growing body of work on infant, toddler and preschooler cognition, Gopnik, Griffiths and other UC Berkeley psychologists, computer scientists and philosophers will launch a multidisciplinary center at the campus’s Institute of Human Development to pursue this line of research.

    A growing body of child cognition research at UC Berkeley suggests that parents and educators put aside the flash cards, electronic learning games and rote-memory tasks and set kids free to discover and investigate.

    “Spontaneous and ‘pretend play’ is just as important as reading and writing drills,” Gopnik said.

    Of all the primates, Gopnik said, humans have the longest childhoods, and this extended period of nurturing, learning and exploration is key to human survival. The healthy newborn brain contains a lifetime’s supply of some 100 billion neurons which, as the baby matures, grow a vast network of synapses or neural connections – about 15,000 per neuron by the age of 2 or 3 – that enable children to learn languages, become socialized and figure out how to survive and thrive in their environment.

    Adults, meanwhile, stop using their powers of imagination and hypothetical reasoning as they focus on what is most relevant to their goals, Gopnik said. The combination of goal-minded adults and open-minded children is ideal for teaching computers new tricks.

    “We need both blue-sky speculation and hard-nosed planning,” Gopnik said. Researchers aim to achieve this symbiosis by tracking and making computational models of the cognitive steps that children take to solve problems in the following and other experiments.

    In UC Berkeley psychologist Fei Xu’s Infant Cognition and Language Lab, pre-verbal babies are tested to see if they can figure out the odds of getting the color of lollipop they want based on the proportions of black and pink lollipops they can see in two separate jars. One jar holds more pink lollipops than black ones, and the other holds more black than pink.

    After the baby sees the ratio of pink to black lollipops in each jar, a lollipop from each jar is covered, so the color is hidden, then removed and placed in a covered canister next to the jar. The baby is invited to take a lollipop and, in most cases, crawls towards the canister closest to the jar that held more pink lollipops.

    “We think babies are making calculations in their heads about which side to crawl to, to get the lollipop that they want,” Xu said.

    Gopnik is studying the “golden age of pretending,” which typically happens between ages 2 and 5, when children create and inhabit alternate universes. In one of her experiments, preschoolers sing “Happy Birthday” whenever a toy monkey appears and a music player is switched on. When the music player is suddenly removed, preschoolers swiftly adapt to the change by using a wooden block to replace the music player so the fun game can continue.

    Earlier experiments by Gopnik — including one in which she makes facial expressions while tasting different kinds of foods to see if toddlers can pick up on her preferences — challenge common assumptions that young children are self-centered and lack empathy, and indicate that, at an early age, they can place themselves in other people’s shoes.

    UC Berkeley psychologists Tania Lombrozo and Elizabeth Bonawitz are finding that preschoolers don’t necessarily go with the simplest explanation, especially when presented with new evidence. In an experiment conducted at Berkeley and the Massachusetts Institute of Technology, preschoolers were shown a toy that lit up and spun around. They were told that a red block made the toy light up, a green one made it spin and a blue one could do both.

    It would have been easiest to assume the blue block was activating the toy when it simultaneously spun and lit up. But when the preschoolers saw there were very few blue blocks compared to red and green ones, many of them calculated the odds and decided that a combination of red and green blocks was causing the toy to spin and light up at the same time, which is an appropriate answer.

    “In other words, children went with simplicity when there wasn’t strong evidence for an alternative, but as evidence accumulated, they followed its lead,” Lombrozo said. Like the children in the study, computers would also benefit from looking at new possibilities for cause and effect based on changing odds.
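
    Griffiths’s Bayesian framing makes it easy to sketch what the children may be doing. In the toy calculation below – all probabilities are assumptions made up for illustration, not the study’s data – the simple “one blue block” explanation wins when blue blocks are common, but loses to the “red plus green” combination once blue blocks are rare, the same reversal the preschoolers showed.

        # Toy Bayes-rule model of the block experiment; numbers are assumptions.
        # H1: a single blue block made the toy spin AND light up (simpler).
        # H2: a red block (light) and a green block (spin) acted together.

        def p_blue_did_it(p_blue):
            """Posterior for H1 when a fraction p_blue of blocks are blue and
            the rest are split evenly between red and green."""
            p_other = (1 - p_blue) / 2
            h1 = p_blue             # prior weight: a blue block is in play
            h2 = p_other * p_other  # prior weight: a red and a green block are
            return h1 / (h1 + h2)   # both hypotheses explain the evidence equally

        print(p_blue_did_it(1 / 3))  # blocks equally common -> H1 wins (0.75)
        print(p_blue_did_it(0.02))   # blue blocks rare -> H1 drops (~0.08)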

    Overall, the UC Berkeley researchers say they will apply what they have learned from the exploratory and “probabilistic” reasoning demonstrated by the youngsters in these and other experiments to make computers smarter, more adaptable — and more human.

  • E-Waste Recycling Turns Phones Into Gold

    We throw things away. It’s just part of being a consumer culture. If a phone breaks or we’re done with the current technology, we can either sell it or throw it away. Unfortunately, the majority of Americans seem to think they can only throw away that old cell phone.

    Today’s infographic from Server Monkey seeks to inform Americans of all the technology they’re throwing away and how to reduce their e-waste. From computers to keyboards, it seems that the majority of consumer technology that Americans use is thrown away every year. The stats are somewhat alarming considering just how many computers are sold every year. What does the average consumer do with their old computer once they buy a new one? Well, according to this infographic, only 39.7 percent of Americans actually recycled them. That’s a majority of computers in this country just being thrown away to end up in some landfill.

    The amount of precious metals in cell phones is the most surprising part, though. Your phone is a veritable gold mine that also contains silver, palladium and copper. Your phone would have to be part of a mass recycling of 1 million cell phones to yield 50 pounds of gold, but that doesn’t seem like much when you consider how many cell phones there are in the world right now.
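
    The per-phone arithmetic is easy to check: 50 pounds of gold spread across a million handsets comes to roughly 23 milligrams apiece, in line with commonly cited estimates of a phone’s gold content.

        # 50 pounds of recovered gold per 1,000,000 recycled phones.
        POUND_TO_GRAMS = 453.592
        total_g = 50 * POUND_TO_GRAMS              # ~22,680 g of gold
        per_phone_mg = total_g / 1_000_000 * 1000
        print(f"~{per_phone_mg:.1f} mg of gold per phone")  # ~22.7 mg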

    Of course, the main problem when it comes to e-waste is the pollution and poisons that seep out of them over the years. It’s especially worrisome considering that rain can bring these poisons into rivers which can get into our drinking water. Some of the worst materials to come out of e-waste are mercury, lead and arsenic.

    While you probably learned the three Rs (Reduce, Reuse and Recycle) in elementary school, e-waste goes by the RRS (Repurpose, Recycle and Sell) method. I personally suggest going with the sell tactic, especially with all the great deals coming from retailers who want your old iPad 2 on the eve of the new iPad launch.

    The Growing E-Waste Epidemic [Infographic]
    Infographic via: ServerMonkey.com, the Industry Leader for Refurbished Servers

  • MacBook Pro Rumor: Big Redesign Coming This Year

    Rumor has it that Apple is readying a massive refresh of its MacBook Pro line to make the machines more like the MacBook Air in design.

    The rumor comes from AppleInsider, which cites “people familiar with Apple’s roadmap”. The publication reports:

    This will include new, ultra-thin unibody enclosures that jettison yesteryear technologies like optical disk drives and traditional hard drives in favor of models with lightweight chassis that employ flash-memory based solid-state drives, instant-on capabilities, extended battery life, and rely on digital distribution for software and media.

    “They’re all going to look like MacBook Airs,” one person familiar with the new MacBook Pro designs told AppleInsider. Meanwhile, existing MacBook Pro designs are expected to be phased out over the course of the year.

    The late Steve Jobs was known for his focus on design, sometimes at the cost of functionality (see Antennagate). It sounds like design is still a vital part of Apple’s mentality under CEO Tim Cook so far.

    The iPhone and iPad might be the sexier Apple products these days, but Macs continue to sell quite well too. Apple’s Q1 earnings report indicates that Mac sales were up 26%, with 5.2 million Macs sold in the quarter. In addition to that, over 100 million apps were downloaded from the Mac App Store in less than a year.

    Interestingly, however, iOS has eclipsed Mac OS X in web market share, according to a recent study from Chitika.

  • Jacob Goldman, Xerox Lab Founder, Is Dead

    100% of you would probably not be reading this if it hadn’t been for Dr. Jacob E. Goldman.

    While he wasn’t the inventor of Xerox copiers, Jacob Goldman, a physicist and visionary, made an even more important contribution to your life: he was the chief scientist of Xerox’s research center that invented the modern personal computer. So today, when you BitTorrent the Fast and Furious series or tweet about how long it’s taking you to move through the ordering line at Panera, take a moment to respectfully observe the contribution of Dr. Goldman, for he has passed. He was 90.

    Xerox has been so successful in the realm of office copiers that it’s synonymous with any copying machine, not to mention that it’s now been verbified (“Peeeter, what’s happening? Hey, could you go Xerox 30 copies of this for me? Greeeaaaat, thanks.”). Foreseeing that the people of the future would want to copy things in a new way, a la “copy and paste,” Xerox entered the realm of computer manufacturing. Xerox brought in Dr. Goldman to helm the mission to shrink and personalize the computer (these were still the days of living room-sized computers you see in old sci-fi war rooms) and, voila, about 40 years later you probably couldn’t figure out how to turn on a faucet without your personal computer.

    While Xerox fumbled their future with personal computers, you still have Dr. Goldman to thank for the fact that you can read this from your laptop, tablet, phone, whatever. Just think, we might not have had Saint Jobs had it not been for Dr. Goldman’s innovative, brilliant mind.

  • Look Into The Future With IBM’s 5-in-5 Tech Predictions

    Every year since 2006, IBM has released a “Five-In-Five” list that details their predictions for the future of technology. Basically, IBM picks five innovations that will impact the way we live in the next five years.

    This year, they look at things like mind-reading, spam, and sustainable energy. Let’s take a look into the future, according to IBM.

    1. People will power their houses with energy that they create themselves.  True renewable energy will become the norm, allowing people to power their homes and work places with simple movement – like the turning of the spokes on a bicycle or the running of water through pipes.
    2. No more passwords.  Biometric tools like retina scanners and voice recognition will totally replace traditional passwords, and you’ll be able to use these login devices at places like the ATM.
    3. Your devices will read your mind.  In the next 5 years, scientists will have discovered a way to link your mind to your devices so that you can just think about calling someone and it will happen.  Further down the road, people might be able to simply think about what they want to say and have it typed out for them on their computers.
    4. Mobile technology will help close the digital divide.  They predict that within 5 years, 80% of the global population will own a mobile device.  Communities all over the world will be able to use this technology to access mobile commerce programs and virtual health care.
    5. Spam will become personal.  “Junk mail will become priority mail. In five years, unsolicited advertisements may feel so personalized and relevant it may seem spam is dead.”

    Not all of the past IBM predictions have come to pass. For instance, we don’t quite have mind reading mobile phones yet (predicted back in 2006) and we haven’t been able to eradicate the “forgetting” part of aging (predicted in 2008). But some of the predictions have definitely come true.

    Two of the first year’s predictions, for instance, have pretty much come true:

    We will be able to access healthcare remotely from just about anywhere in the world. Today, through telemedicine, patients can connect with physicians or specialists from just about anywhere via inexpensive computers and broadband networks. Doctors can view x-rays and other diagnostic imagery from thousands of miles away.

    Technologies the size of a few atoms will address areas of environmental importance. Nanotechnology is now used in countless fields and industries, including agriculture, biotechnology and sensor networks, enabling us to understand and interact with the natural environment like never before.

    Plus, they predicted that users would soon talk to the web, and the web would talk back. Sounds a little bit like Siri and other voice assistants, no?

    Which of IBM’s 2011 predictions do you see coming true in the next 5 years? Which ones seem just a little too ambitious? Let us know in the comments.

  • Asus Eee Pad Transformer Prime Now Available for Pre-Order

    You can now pre-order the Asus Eee Pad Transformer Prime.

    If you’re unfamiliar with the device, it’s the first quad-core Android tablet. It has an 8.3mm-thick body, a 1280×800 display, an Nvidia Tegra 3 processor, up to 12 hours of battery life, 1 GB of RAM, an 8-megapixel rear-facing camera, a 1.2-megapixel front-facing camera, and a microSD slot. The entire device weighs in at only 1.29 pounds.

    The updated Transformer Prime will run Android 3.2 when it ships. Owners will be able to upgrade to Android 4.0, a.k.a Ice Cream Sandwich, in the near future.

    The base price for the device is $499 for the 32GB version and $599 for the 64GB. You can also purchase an attachable keyboard, which will cost you $149. The updated Transformer Prime is expected to ship sometime in December of this year.

    It might be a hard sell, especially when you consider that prices for Apple’s iPad are comparable. Which do you think consumers would be more likely to purchase?

  • Computers: From Filling a Room to Filling Your Bloodstream

    Okay, I’ll be honest.  The concept of a room-sized computer is so foreign to me that I can’t even describe it to you.  As a mid-’80s baby, the biggest hunk of computer machinery I ever used was the old family HP Pavilion desktop back in the mid-’90s.

    But I do have a sense of the history of computers, and like the rest of the world I am amazed by the journey they’ve taken in the last 50 years.  We take for granted the computers we carry around in our pockets – our 16 GB iPhones and our 160 GB iPods.  In 1980, 2GB of storage meant a system the size of a refrigerator.  Yeah, fit that into your skinny jeans.

    This infographic made by onlinecomputersciencedegree.com is a nerdgasm, combining the history and possible future of computing with R.E.M.  Based around their 1987 classic and Independence Day alien-signalling song “It’s the End of the World as We Know It (And I Feel Fine),” it charts the shrinking physical size and expanding capabilities of computers from 1958 to 2010 and beyond.

    From Jack Kilby building the world’s first integrated circuit in 1958, to the first laptop, the Osborne 1, being created in 1981, to current drives capable of holding 2TB in the size of a book – it charts the milestones.

    It also begins to speculate about computing in the future – fiber-optic computers that run on light instead of electricity and DNA-based computing that could replicate biological entities in order to solve health problems.

    Oh yeah, and of course they retool the lyrics of a great song to say things like  “more efficient software!”

    Lenny Bruce Gordon Moore is not afraid. Check it out:

    End of Computers
    Via: OnlineComputerScienceDegree.com

  • Sitting Down All Day Is Killing You

    As I type this, I am seated in my comfy leather chair. I’m sure that many of you who are reading this are also situated in some kind of seat. How long have you been sitting there? When is the last time you got up and took a stroll around the office?

    It is a fact that many people’s jobs these days require them to spend a copious amount of time in front of a computer. Writers, coders, editors, secretaries – the list goes on. According to reports that have been flooding in during the last year or so, the more hours you spend sitting during the day, the greater your risk of early death.

    First, Men’s Health reported on a study that came to this conclusion. Then The New York Times reported on another study that found that people had a greater risk of obesity, heart disease and type 2 diabetes when they sat an extra 6-8 hours a day.

    The posture of sitting itself probably isn’t worse than any other type of daytime physical inactivity, like lying on the couch watching “Wheel of Fortune.” But for most of us, when we’re awake and not moving, we’re sitting. This is your body on chairs: Electrical activity in the muscles drops — “the muscles go as silent as those of a dead horse,” Hamilton says — leading to a cascade of harmful metabolic effects. Your calorie-burning rate immediately plunges to about one per minute, a third of what it would be if you got up and walked. Insulin effectiveness drops within a single day, and the risk of developing Type 2 diabetes rises. So does the risk of being obese. The enzymes responsible for breaking down lipids and triglycerides — for “vacuuming up fat out of the bloodstream,” as Hamilton puts it — plunge, which in turn causes the levels of good (HDL) cholesterol to fall.

    So all this bad stuff over a lifetime adds up, according to researchers.

    Today comes this awesome new infographic courtesy of medicalbillingnacoding.org. Yes, the information provided is scary, especially for someone who likes to sit as much as I do. But the visuals are pretty top notch, especially the presentation of “sitting” as a jagged, hooded demon poised dramatically over the head of an unsuspecting office worker.

    Many propose stand up desks as a way to solve this problem. Others suggest frequent breaks from sitting on the job to perform various quick exercises. As someone with terrible knees, standing desks make me cringe. Maybe I’ll rethink the whole jumping jacks idea.

    Sitting is Killing You
    Via: Medical Billing And Coding

  • Apple MacBook Pro Gets Processor, Graphics, Thunderbolt I/O Update

    The Apple MacBook Pro family has been updated with next-generation processors and graphics, high-speed Thunderbolt I/O technology and a new FaceTime HD camera.

    FaceTime has become a popular selling point of Apple’s iPhone, and the iPad 2 is expected to accommodate it as well. 

    "The new MacBook Pro brings next generation dual and quad Core processors, high performance graphics, Thunderbolt technology and FaceTime HD to the great design loved by our pro customers," said Apple SVP of Worldwide Product Marketing Philip Schiller. "Thunderbolt is a revolutionary new I/O technology that delivers an amazing 10 gigabits per second and can support every important I/O standard which is ideal for the new MacBook Pro."

    Apple Macbook Pro family gets updated

    Apple says in its announcement, "MacBook Pro is the first computer on the market to include the groundbreaking Thunderbolt I/O technology. Developed by Intel with collaboration from Apple, Thunderbolt enables expandability never before possible on a notebook computer. Featuring two bi-directional channels with transfer speeds up to an amazing 10Gbps each, Thunderbolt delivers PCI Express directly to external high performance peripherals such as RAID arrays, and can support FireWire and USB consumer devices and Gigabit Ethernet networks via adapters. Thunderbolt also supports DisplayPort for high resolution displays and works with existing adapters for HDMI, DVI and VGA displays. Freely available for implementation on systems, cables and devices, Thunderbolt technology is expected to be widely adopted as a new standard for high performance I/O."

    "With Apple’s innovative FaceTime video calling software, the new camera allows high definition video calls between all new MacBook Pro models and supports standard resolution calls with other Intel-based Macs, iPhone 4 and the current generation iPod touch," the company says. "FaceTime is included with all new MacBook Pro models and is available for other Intel-based Macs from the Mac App Store for 99 cents. The MacBook Pro lineup continues to feature its gorgeous aluminum unibody enclosure, glass Multi-Touch trackpad, LED-backlit widescreen display, illuminated full-size keyboard and 7-hour battery."

    There is a 13-inch Apple MacBook Pro model, as well as a 15-inch and a 17-inch model. The 13-inch version is available in a 2.3 GHz Dual-Core Intel Core i5 and 320GB hard drive version for $1,199, as well as a 2.7 GHz Dual-Core Intel Core i5 and 500GB hard drive version for $1,499. The 15-inch MacBook Pro is available in a 2.0 GHz Quad-Core Intel Core i7, AMD Radeon HD 6490M and 500GB hard drive version for $1,799 and a 2.2 GHz Quad-Core Intel Core i7, AMD Radeon HD 6750M and 750GB hard drive version for $2,199. The new 17-inch Apple MacBook Pro comes in just one version –  2.2 GHz Quad-Core Intel Core i7, AMD Radeon HD 6750M and 750GB hard drive, and is priced at $2,499.

    Apple has an event scheduled for March 2nd, where it will almost certainly unveil the iPad 2. Stay tuned for that. Meanwhile, the iPad continues to dominate tablet shipments.

  • HP Unveils New Touch Notebook and Some New Minis

    HP unveiled a few new mobile devices today at CES. These include a new TouchSmart notebook, and a handful of minis.

    The HP TouchSmart tm2 builds on HP’s existing TouchSmart software, and updates the tx2. "With the convertible tm2, customers can choose the product configuration and input method that is most comfortable and natural, whether on a couch, in an airplane or while surfing the web," the company says. "As a traditional notebook, the tm2 offers a keyboard and touch-enabled display for input. Converted to a slate, the tm2 morphs into a sketchpad with digital pen, allowing artists to sketch on the go and students to take notes in class."

    HP lists the following new touch applications that enhance the tm2 experience:

    – BumpTop, a touch-enhanced 3-D interface for photos and documents that allows users to spatially organize and “toss” or share files and photos to social media sites or email.

    – DigiFish Dolphin, an interactive 3-D screensaver that recreates an ocean environment to be experienced via touch or a mouse.

    – Corel Paint it! Touch, which allows users to draw and paint or turn photos into paintings using fingertips.

    More details about the TouchSmart tm2 can be found here.

    Touchsmart tm2

    The HP Mini 5102 is described as a "full-performance" netbook designed for mobile professionals and students. It comes in either a standard or touch-enabled version. The touchscreen option includes multitouch gestures, finger taps, and swipes across the screen for navigation. More can be read about this here.

    HP Mini 5102

    Finally, HP unveiled the HP Mini 210 and HP Mini 2102, with optional 3G broadband connectivity and GPS, in multiple color options. HP says new software enhances the HP Mini 210’s multimedia experience:

    – HP CloudDrive powered by ZumoDrive allows users to access their synchronized content – documents, photos and music – from the cloud, without having to store it on a local drive.

    – HP MediaStream allows users to stream multimedia content from one PC to another over the Internet, without requiring data to be downloaded.

    – HP QuickSync software, also available on the HP Mini 2102, automatically synchronizes files created or edited on the road with a home or business PC over a wireless connection when connected to the same network.

    – HP QuickWeb allows users to access the web without booting up the notebook by simply pushing a button. In seconds, users have a connection to the Internet and can access websites and other content that normally requires a standard browser, as well as photos, music and more.

    More details about the Mini 210 and Mini 2102 can be found here.

    The HP TouchSmart tm2 is expected to be available in the US on Jan. 17 in all colors with a starting price of $949. The HP Mini 5102 is expected to be available in the US this month with a starting price of $399. The HP Mini 210 is expected to be available in the US tomorrow with a starting price of $299 for Windows 7. The HP Mini 2102 is also expected to be available in the US tomorrow with a starting price of $329.

    Watch for more WebProNews coverage of CES, with exclusive video interviews coming soon.


    Related Articles:

    > CES About to Kick Off the Year in Technology

    > Cisco Leaving a Big Mark on Consumer Electronics Show

    > Looking Back on CES