Free resource of educational web tools, 21st century skills, tips and tutorials on how teachers and students integrate technology into education
Via Tom D'Amico (@TDOttawa), Michel Verstrepen
Omar Elizondo's curator insight,
May 16, 2019 8:10 AM
Many elementary school teachers love to teach reading and writing, but are less comfortable with science and math. It’s not a hard and fast truth, of course, but learning to read is a big focus of the early school years, so it makes sense that teachers who gravitate toward elementary school like teaching literacy. But it’s also important to expose kids to science early and get them excited about the practices that define scientific inquiry.
Gonzalo San Gil, PhD.'s curator insight,
March 31, 2015 4:12 AM
# ! Wonder if Queen Anne did expect all this mess
Beth Dichter's curator insight,
April 29, 2014 9:50 PM
Richard Byrne provides a number of links to websites with great hands-on activities for students learning science. The sites include:
* Science is Fun - 25 chemistry experiments geared to students in grades 4-9.
* The Museum of Science and Industry in Chicago - activities in twelve topics, with experiments for students from pre-K through grade 12.
* Discover the World, from NOAA - 43 experiments, probably best for grades 4-8.
* Squishy Circuits - learn how to make the "dough" used to create these circuits, and watch a TED-Ed video to learn more.
There are many ideas to be found in this post and lots of fun for your students to experience while they explore and learn science!
luiy's curator insight,
February 26, 2014 6:23 AM
..... but, examined carefully, the articles seem more enthusiastic than substantive. As I wrote before, the story about Watson was off the mark factually. The deep-learning piece had problems, too. Sunday's story is confused at best; there is nothing new in teaching computers to learn from their mistakes. Instead, the article seems to be about building computer chips that use "brainlike" algorithms, but the algorithms themselves aren't new, either. As the author notes in passing, "the new computing approach" is "already in use by some large technology companies." Mostly, the article seems to be about neuromorphic processors—computer processors that are organized to be somewhat brainlike—though, as the piece points out, they have been around since the nineteen-eighties. In fact, the core idea of Sunday's article—nets based "on large groups of neuron-like elements … that learn from experience"—goes back over fifty years, to the well-known Perceptron, built by Frank Rosenblatt in 1957. (If you check the archives, the Times billed it as a revolution, with the headline "NEW NAVY DEVICE LEARNS BY DOING." The New Yorker similarly gushed about the advancement.) The only new thing mentioned is a computer chip, as yet unproven but scheduled to be released this year, along with the claim that it can "potentially [make] the term 'computer crash' obsolete." Steven Pinker wrote me an e-mail after reading the Times story, saying "We're back in 1985!"—the last time there was huge hype in the mainstream media about neural networks.
R Schumacher & Associates LLC's curator insight,
January 15, 2014 1:43 PM
The monikers such as "deep learning" may be new, but Artificial Intelligence has always been the Holy Grail of computer science. The applications are many, and the path is becoming less of an uphill climb.
luiy's curator insight,
February 26, 2014 6:19 AM
Deep learning itself is a revival of an even older idea for computing: neural networks. These systems, loosely inspired by the densely interconnected neurons of the brain, mimic human learning by changing the strength of simulated neural connections on the basis of experience. Google Brain, with about 1 million simulated neurons and 1 billion simulated connections, was ten times larger than any deep neural network before it. Project founder Andrew Ng, now director of the Artificial Intelligence Laboratory at Stanford University in California, has gone on to make deep-learning systems ten times larger again.
Such advances make for exciting times in artificial intelligence (AI) — the often-frustrating attempt to get computers to think like humans. In the past few years, companies such as Google, Apple and IBM have been aggressively snapping up start-up companies and researchers with deep-learning expertise. For everyday consumers, the results include software better able to sort through photos, understand spoken commands and translate text from foreign languages. For scientists and industry, deep-learning computers can search for potential drug candidates, map real neural networks in the brain or predict the functions of proteins.
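The learning mechanism described above — simulated connections whose strengths change on the basis of experience — can be sketched with the simplest possible case: a single simulated neuron trained with the classic perceptron rule. This is an illustrative toy (the data, learning rate, and epoch count are my own assumptions), not how Google Brain or any production deep-learning system is implemented; real systems use many layers and gradient-based training.

```python
# A minimal sketch of "learning by changing connection strengths":
# one simulated neuron whose weights are nudged whenever its output
# disagrees with experience (the training label). Data and parameter
# values below are illustrative assumptions.

def train_perceptron(samples, epochs=20, lr=0.1):
    """Learn connection strengths from (inputs, label) pairs."""
    n = len(samples[0][0])
    weights = [0.0] * n
    bias = 0.0
    for _ in range(epochs):
        for inputs, label in samples:
            # The neuron "fires" (outputs 1) if the weighted sum of
            # its inputs crosses the threshold.
            activation = sum(w * x for w, x in zip(weights, inputs)) + bias
            output = 1 if activation > 0 else 0
            # Learning from experience: strengthen or weaken each
            # connection in proportion to the error and its input.
            error = label - output
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
            bias += lr * error
    return weights, bias

def predict(weights, bias, inputs):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) + bias > 0 else 0

# Teach the neuron logical AND from four examples.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(data)
print([predict(w, b, x) for x, _ in data])  # expect [0, 0, 0, 1]
```

The same update idea — adjust connection strengths in the direction that reduces error — is what deep networks scale up to millions of neurons and billions of connections.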
Gonzalo San Gil, PhD.'s curator insight,
September 6, 2016 6:11 AM
# ! ... of #sharing, from #discoveries to #creation,
# ! for #advancing in #Mankind's #welfare.
Willem Kuypers's curator insight,
October 26, 2014 3:34 AM
Sites for finding information for science courses.
vicky carroll's comment,
May 12, 2017 2:09 AM
Fantastic resource for transforming learning by getting students to communicate/collaborate with external parties after researching topics. Would allow redefinition using the SAMR model.
Ness Crouch's curator insight,
March 29, 2014 5:13 PM
These motion infographics look interesting. I wonder if I can find content for my class?
Jeongbae Kong Enanum's curator insight,
August 16, 2014 9:48 AM
Won Ho: <This is a thoughtful professor's post, so I should study it further.> Why must the video be watched before class? The core is access to quality video and intensive in-class interaction; an ordinary lecture can't compete with these superb ones. Must the classroom necessarily be flipped? For me, the heart of flipped learning is high-quality video and strong interaction — many orderings and formats are possible. Watching these videos, it seems clear that the teacher's role of explaining concepts in person will soon disappear.
María Dolores Díaz Noguera's curator insight,
February 4, 2016 7:39 AM
Flipping The Flipped Classroom - Motion Infographics For STEM Learning | @scoopit via @BethDichter http://sco.lt/...
luiy's curator insight,
February 26, 2014 6:28 AM
The map focuses on six big stories of science that we think will play out over the next decade:
1. Decrypting the Brain
2. Hacking Space
3. Massively Multiplayer Data
4. Sea the Future
5. Strange Matter
6. Engineered Evolution
- See more at: http://www.iftf.org/our-work/people-technology/technology-horizons/the-future-of-science/