Advancing AI theory with a first-principles understanding of deep neural networks
#162 — June 28, 2021
AI Digest
Spread the word, build the community, share the knowledge – invite your friends.
this week's favorite
Advancing AI theory with a first-principles understanding of deep neural networks
The steam engine powered the Industrial Revolution and changed manufacturing forever — and yet it wasn’t until the laws of thermodynamics and the principles of statistical mechanics were developed over the following century that scientists could fully explain at a theoretical level why and how it worked.
Meet Kats — a one-stop shop for time series analysis
A new library to analyze time series data. Kats is a lightweight, easy-to-use, and generalizable framework for generic time series analysis, including forecasting, anomaly detection, multivariate analysis, and feature extraction/embedding. To the best of our knowledge, Kats is the first comprehensive Python library for generic time series analysis, which provides both classical and advanced techniques to model time series data.
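To give a feel for the kind of task Kats bundles, here is a minimal stdlib sketch of anomaly detection via a rolling z-score. Note this is illustrative code, not the Kats API itself; the window size and threshold are assumptions, and in Kats you would instead wrap your data in its `TimeSeriesData` type and use one of its built-in detectors.

```python
# Rolling z-score anomaly detection: flag points that deviate from the
# trailing window by more than `threshold` standard deviations.
from statistics import mean, stdev

def rolling_zscore_anomalies(series, window=5, threshold=3.0):
    anomalies = []
    for i in range(window, len(series)):
        past = series[i - window:i]
        mu, sigma = mean(past), stdev(past)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# A flat series with one spike at index 10
data = [10.0, 10.1, 9.9, 10.0, 10.2, 10.1,
        9.8, 10.0, 10.1, 9.9, 25.0, 10.0]
print(rolling_zscore_anomalies(data))  # flags index 10
```

Libraries like Kats wrap this kind of detector (plus forecasting and feature extraction) behind a single time series abstraction, so you don't hand-roll the windowing yourself.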
In an autonomous driving system, it is essential to recognize vehicles, pedestrians, and cyclists from images. Beyond high prediction accuracy, the requirement of real-time execution brings new challenges for convolutional network models. In this report, we introduce a real-time method to detect 2D objects in images.
VOLO: Vision outlooker for visual recognition
Visual recognition has been dominated by convolutional neural networks (CNNs) for years. Though the recently prevailing vision transformers (ViTs) have shown the great potential of self-attention-based models in ImageNet classification, their performance is still inferior to the latest SOTA CNNs if no extra data are provided. In this work, we aim to close the performance gap and demonstrate that attention-based models are indeed able to outperform CNNs.
Regularization is all you need: simple neural nets can excel on tabular data
Tabular datasets are the last "unconquered castle" for deep learning, with traditional ML methods like Gradient-Boosted Decision Trees still performing strongly even against recent specialized neural architectures. In this paper, we hypothesize that the key to boosting the performance of neural networks lies in the joint application of a large set of modern regularization techniques.
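As a toy illustration of one ingredient in such a regularization "cocktail", here is L2 weight decay in a plain gradient-descent loop. The data and hyperparameters are made up for the example; the paper's actual cocktails combine many more techniques (dropout, data augmentation, etc.) on real neural networks.

```python
# L2 weight decay on a 1-D linear model y = w * x, trained by
# gradient descent on MSE + weight_decay * w^2.

def fit_linear(xs, ys, weight_decay=0.0, lr=0.01, steps=1000):
    w = 0.0
    n = len(xs)
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n
        grad += 2 * weight_decay * w  # the regularization term
        w -= lr * grad
    return w

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]  # roughly y = 2x
w_plain = fit_linear(xs, ys)
w_decay = fit_linear(xs, ys, weight_decay=1.0)
print(w_plain, w_decay)  # weight decay pulls the weight toward zero
```

The penalty biases the fit toward smaller weights, trading a little training error for robustness; the paper's point is that carefully tuning many such penalties at once lets even simple MLPs compete with gradient-boosted trees on tabular data.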