Display everybody's notes in tooob|machine

Friday, May 29, 08:34PM  by:shuri
deep, innovation

source Forbes Insights: How Europe Is Leading The Way With Responsible AI
Saskia Steinacker, Global Head of Digital Transformation at Bayer, shares how the EU is putting human rights and ethics at the center of AI development.
Friday, May 29, 08:25PM  by:shuri

source Language Models are Few-Shot Learners
Recent work has demonstrated substantial gains on many NLP tasks and benchmarks by pre-training on a large corpus of text followed by fine-tuning on a specific task. While typically task-agnostic in architecture, this method still requires task-specific fine-tuning datasets of thousands or tens of thousands of examples. By contrast, humans can generally perform a new language task from only a few examples or from simple instructions - something which current NLP systems still largely struggle to do. Here we show that scaling up language models greatly improves task-agnostic, few-shot performance, sometimes even reaching competitiveness with prior state-of-the-art fine-tuning approaches. Specifically, we train GPT-3, an autoregressive language model with 175 billion parameters, 10x more than any previous non-sparse language model, and test its performance in the few-shot setting. For all tasks, GPT-3 is applied without any gradient updates or fine-tuning, with tasks and few-shot demonstrations specified purely via text interaction with the model. GPT-3 achieves strong performance on many NLP datasets, including translation, question-answering, and cloze tasks, as well as several tasks that require on-the-fly reasoning or domain adaptation, such as unscrambling words, using a novel word in a sentence, or performing 3-digit arithmetic. At the same time, we also identify some datasets where GPT-3's few-shot learning still struggles, as well as some datasets where GPT-3 faces methodological issues related to training on large web corpora. Finally, we find that GPT-3 can generate samples of news articles which human evaluators have difficulty distinguishing from articles written by humans. We discuss broader societal impacts of this finding and of GPT-3 in general.
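As the abstract notes, GPT-3 is applied without gradient updates, with tasks and demonstrations specified purely via text interaction. A minimal sketch of how such a few-shot prompt might be assembled (the task, helper name, and examples here are illustrative, not taken from the paper):

```python
def build_few_shot_prompt(instruction, demonstrations, query):
    """Assemble a few-shot prompt: an instruction, a handful of
    worked input/output pairs, then the new input the model
    should complete after the final 'Output:'."""
    lines = [instruction, ""]
    for source, target in demonstrations:
        lines.append(f"Input: {source}")
        lines.append(f"Output: {target}")
    lines.append(f"Input: {query}")
    lines.append("Output:")
    return "\n".join(lines)

# Word unscrambling is one of the tasks the abstract mentions.
demos = [("pplae", "apple"), ("anaban", "banana")]
prompt = build_few_shot_prompt(
    "Unscramble the letters into a word.", demos, "rgaep")
```

The model's continuation after the trailing "Output:" is taken as its answer; "few-shot" just means a few demonstrations appear in the context window, never in the weights.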
Wednesday, May 27, 09:14AM  by:shuri

source How to Build your own Feature Store - Logical Clocks
Given the increasing interest in feature stores, we share our own experience of building one to help others who are considering following us down the same path.
Wednesday, May 27, 09:13AM  by:shuri

source A Complete 4-Year Course Plan for an Artificial Intelligence Undergraduate Degree - Mihail Eric
I describe in-depth the courses necessary for a 4-year undergraduate degree in artificial intelligence, assuming you step onto campus tomorrow.
Saturday, May 16, 10:52AM  by:shuri
deep, upload, free, video phone, camera phone, sharing, video

source R-squared or coefficient of determination | Regression | Probability and Statistics | Khan Academy
R-Squared or Coefficient of Determination Watch the next lesson: https://www.khanacademy.org/math/probability/regression/regression-correlation/v/calculating...
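The quantity the video derives is R² = 1 − SS_res / SS_tot, the fraction of variance in y explained by the fit. A small self-contained sketch of that formula (the toy data below is illustrative):

```python
def r_squared(y, y_hat):
    """Coefficient of determination: R^2 = 1 - SS_res / SS_tot,
    where SS_res is the sum of squared residuals of the fit and
    SS_tot is the total sum of squares around the mean of y."""
    mean_y = sum(y) / len(y)
    ss_tot = sum((yi - mean_y) ** 2 for yi in y)
    ss_res = sum((yi - fi) ** 2 for yi, fi in zip(y, y_hat))
    return 1 - ss_res / ss_tot

# A perfect fit explains all the variance (R^2 = 1); predicting
# the mean everywhere explains none of it (R^2 = 0).
perfect = r_squared([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
mean_only = r_squared([1.0, 2.0, 3.0], [2.0, 2.0, 2.0])
```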
Thursday, May 14, 10:04AM  by:shuri
deep, compute, tesla a100, nvidia, ga100, ampere

source Nvidia Unifies AI Compute With “Ampere” GPU
The in-person GPU Technology Conference held annually in San Jose may have been canceled in March thanks to the coronavirus pandemic, but behind the scenes…
Friday, May 08, 12:42PM  by:shuri

source Perovskite neural trees
Trees are used by animals, humans and machines to classify information and make decisions. Natural tree structures displayed by synapses of the brain involve potentiation and depression, are capable of branching, and are essential for survival and learning. Demonstrating such features in synthetic matter is challenging due to the need to host a complex energy landscape capable of learning, memory and electrical interrogation. We report the experimental realization of tree-like conductance states at room temperature in strongly correlated perovskite nickelates by modulating proton distribution under high-speed electric pulses. This demonstration represents a physical realization of ultrametric trees, a concept from number theory applied to the study of spin glasses in physics that inspired early neural network theory dating back almost forty years. We apply the tree-like memory features in spiking neural networks to demonstrate high-fidelity object recognition, which in the future can open new directions for neuromorphic computing and artificial intelligence.
Friday, May 08, 04:01AM  by:shuri

source Jukebox
We’re introducing Jukebox, a neural net that generates music, including rudimentary singing, as raw audio in a variety of genres and artist styles.
Friday, May 08, 03:55AM  by:shuri

source Why We Need DevOps for ML Data - Tecton
Data is the hardest part of machine learning. We need to apply MLOps practices to the ML data lifecycle to get features to production quickly and reliably.
Saturday, May 02, 01:28PM  by:shuri

source My First Year as a Freelance AI Engineer
This week marks my one-year anniversary of quitting my full-time job and becoming a freelance AI engineer. I’m writing down my thoughts and experience here so that this might be useful if you are even vaguely interested.
Friday, May 01, 07:29PM  by:shuri

source YOLOv4: Optimal Speed and Accuracy of Object Detection
There are a huge number of features which are said to improve Convolutional Neural Network (CNN) accuracy. Practical testing of combinations of such features on large datasets, and theoretical justification of the result, is required. Some features operate on certain models exclusively and for certain problems exclusively, or only for small-scale datasets; while some features, such as batch-normalization and residual-connections, are applicable to the majority of models, tasks, and datasets. We assume that such universal features include Weighted-Residual-Connections (WRC), Cross-Stage-Partial-connections (CSP), Cross mini-Batch Normalization (CmBN), Self-adversarial-training (SAT) and Mish-activation. We use new features: WRC, CSP, CmBN, SAT, Mish activation, Mosaic data augmentation, DropBlock regularization, and CIoU loss, and combine some of them to achieve state-of-the-art results: 43.5% AP (65.7% AP50) for the MS COCO dataset at a real-time speed of ~65 FPS on Tesla V100. Source code is at https://github.com/AlexeyAB/darknet
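Of the components the abstract lists, the Mish activation is easy to state in isolation: mish(x) = x · tanh(softplus(x)), where softplus(x) = ln(1 + eˣ). A standalone sketch of the function (not the paper's Darknet implementation):

```python
import math

def mish(x: float) -> float:
    """Mish activation: x * tanh(softplus(x)).

    Naive form for illustration: math.exp overflows for very
    large positive x (> ~709); library implementations use a
    numerically stable softplus instead."""
    return x * math.tanh(math.log1p(math.exp(x)))
```

Like ReLU it passes large positive inputs through almost unchanged, but it is smooth everywhere and decays to zero (rather than clipping) for large negative inputs.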
Friday, May 01, 07:18PM  by:shuri

source Google Duplex now speaks Spanish, starts calling businesses in Spain to update hours
After expanding Duplex to the U.K., Australia, and Canada in early April, Google brought the AI calling assistant to Spain in a limited capacity.
Wednesday, April 29, 07:33PM  by:shuri
deep, uber, tecton.ai, sequoia capital, machine learning, andreessen horowitz

source Tecton.ai emerges from stealth with $20M Series A to build machine learning platform – TechCrunch
Three former Uber engineers, who helped build the company’s Michelangelo machine learning platform, left the company last year to form Tecton.ai and build an operational machine learning platform for everyone else. Today the company announced a $20 million Series A from a couple of high-profile investors. Andreessen Horowitz and Sequoia Capital co-led the round with […]
Thursday, April 23, 02:28PM  by:shuri
deep, upload, free, video phone, camera phone, sharing, video

source Deep Learning UC Berkeley STAT-157 2019 - YouTube
Thursday, April 23, 01:07PM  by:shuri

source Importing .py files in Google Colab
Is there any way to upload my code in .py files and import them in Colab code cells? The other way I found is to create a local Jupyter notebook and then upload it to Colab; is that the only way?
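One common answer, sketched here rather than quoted from the thread: once the .py file is on the Colab VM's filesystem (e.g. via the file-upload widget or a mounted Drive), it can be imported by path with the standard-library importlib machinery. The `helpers.py` module below is a stand-in written to disk just so the sketch is self-contained:

```python
import importlib.util
import pathlib

def import_from_path(module_name, file_path):
    """Load a module from an arbitrary .py file path."""
    spec = importlib.util.spec_from_file_location(module_name, file_path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module

# Demo: write a tiny module to disk, then import it by path,
# as you would with a file uploaded to the Colab runtime.
pathlib.Path("helpers.py").write_text("def double(x):\n    return 2 * x\n")
helpers = import_from_path("helpers", "helpers.py")
```

Appending the upload directory to `sys.path` and using a plain `import` statement works too; the importlib route just makes the file's location explicit.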
Thursday, April 23, 12:43PM  by:shuri

source 2 ways to upload a csv file to a Google Colab Notebook
This article will help you get started in data science by letting you upload your file to a Python notebook in Google Colab. Google Colab is a free cloud service offered by Google, based on…
Thursday, April 23, 12:43PM  by:shuri

source Get Started: 3 Ways to Load CSV files into Colab - Towards Data Science
Data science is nothing without data. Yes, that’s obvious. What is not so obvious is the series of steps involved in getting the data into a format which allows you to explore the data. You may be in…
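Whichever route gets the bytes into the Colab runtime (the upload widget, a Drive mount, or fetching a raw URL), the parsing step afterwards is the same. A minimal standard-library sketch, with an inline string standing in for the fetched CSV text:

```python
import csv
import io

# Stand-in for CSV text obtained from an upload or a raw URL.
raw = "name,score\nada,3\ngrace,5\n"

# DictReader maps each data row to a dict keyed by the header row.
rows = list(csv.DictReader(io.StringIO(raw)))
scores = [int(r["score"]) for r in rows]
```

In practice most Colab workflows hand the file path or URL straight to `pandas.read_csv`, which does the same parsing and returns a DataFrame.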
Wednesday, April 22, 03:27PM  by:shuri

source Interpretability 2020
An online research report on interpretability for machine learning by Cloudera Fast Forward.
Wednesday, April 08, 03:51PM  by:shuri
deep, python, machine learning, swift

source Swift: Google’s bet on differentiable programming | Tryolabs Blog
Google's plan to make Swift the first mainstream language with first-class, language-integrated differentiable programming capabilities. What's so cool about Swift?
Wednesday, April 01, 12:59PM  by:shuri
Monday, March 30, 07:36PM  by:shuri
deep, upload, free, video phone, camera phone, sharing, video