Starting from AI Grant: how Nat Friedman and Daniel Gross invested in half of American artificial intelligence

Original Source: Cyber Zen Heart

Image source: generated by Unbounded AI

From GitHub, a must-have for developers, to young unicorns like Scale and Cohere, to this year's high-profile Character.ai, while researching outstanding companies in the field of artificial intelligence I kept seeing two names among the early-stage investors: Nat (Nathaniel) Friedman and Daniel Gross.

Since 2017, Nat and Daniel have been investing together in artificial intelligence through an organization they founded called AI Grant. From its beginnings as a fund for academic research grants to today's early-stage venture fund, AI Grant's operating and investment model has given me a great deal of inspiration on how to be an early-stage AI investor who is genuinely helpful to founding teams. Here, I would like to share the stories of how Nat and Daniel grew up and how they invest.

Enjoy!

**Einstein was a patent clerk working in Bern. He had many ideas that everyone thought were crazy.**

**But often, it is the "outsiders" who have the freshest and best ideas.**

**Our goal is to find and fund them.**

Two protagonists

Open Source Pioneer Nat Friedman

「It's hard to imagine myself doing something other than founding a startup. But you never know. I'm open to anything.」

On his personal website, Nat (Nathaniel) Friedman introduces himself with this line: **"I started surfing the Internet in 1991, and the Internet is my real hometown"** and he is not exaggerating.

Born in 1977, Nat Friedman began learning software development at the age of six. In 1991, Linus Torvalds, a young Finn on the other side of the ocean, publicly released Linux. Nat, a small-town Virginia teenager who had just started surfing the Internet, quickly discovered it, and with his sharp mind and boundless curiosity became a well-known hacker in the Linux community. The open source community thus became both the starting point of his career and the foundation of his closest friendships.

In 1999, Nat graduated from MIT with degrees in computer science and mathematics. At 22 he held one firm belief: he only wanted to work in open source. **So, even though he was penniless, he turned down every job offer and secretly lived on a smelly old red sofa in the common room of an MIT dorm, because the internet there was fast.**

Fortunately, Tim Ney of the Free Software Foundation reached out in time and wrote Nat a check for $350, telling him to do whatever he wanted with it. So what exactly did Nat do with the money?

Back in 1996, Nat, then a college freshman, met Miguel de Icaza on LinuxNet, an IRC [1] network. Miguel was a Mexican youth who had dropped out of his mathematics program to devote himself to free software development. In the summer of 1997, Nat, then an intern at Microsoft, met Miguel in person when Miguel came to interview for a position on the Internet Explorer Unix team. However, Miguel could not obtain a work visa because he did not have a university degree, so he never joined Microsoft; instead, in August of that year, he co-founded the open source project GNOME with his friends. In April 1999, on the eve of graduation, Nat proposed to Miguel that they start a company to continue developing GNOME, but they had no money. **Tim's check helped a lot.**

In October 1999, Nat and Miguel founded International GNOME Support (later known as Helix Code) to develop GNOME infrastructure and applications. The company was eventually renamed Ximian and was acquired by Novell in August 2003.

Reflecting on their first entrepreneurial experience, Nat and Miguel wrote:

"Ximian was made up of like-minded friends. We started the company without any entrepreneurial, management, or business experience. We learned on the job and were advised by friends who believed in our purpose and cared about our mission. Ninety percent of Ximian's employees were open source community contributors and people we met through mail or IRC. We had no management experience, which meant we made every textbook management mistake possible, but all of our friends and employees supported us."

After joining Novell, Nat was responsible for all of the company's Linux-related projects and served as CTO for open source, moving more than 6,000 employees off Windows and the Office suite and onto the open source SUSE and OpenOffice. In 2007, Nat moved to Munich and started the SUSE Studio project; he left the company after the product launched in 2009, newly married, at "a natural break point and time to find something new."

Judging from this experience, the big-company working model could not contain Nat's creative soul. David Majda, who joined SUSE shortly after Nat's departure, wrote on his blog: "The application looks modern, is visually pleasing, and easy to use. It doesn't look like a product from a Linux company at all; more like a startup product. SUSE's uncanny ability to create such a first-class application convinced me to join the company and eventually the SUSE Studio team. ... After joining SUSE, I was curious what the secret sauce behind this product was, and very quickly stumbled upon Nat Friedman's name. The whole project was clearly his idea. He convinced management, put together a team of the best developers he could find, ran it like a startup, and built the product over two years. Remember, this is a big company, with Novell's big-corporate-style managers on one side and hardcore hackers from the Linux community on the other; it was not an easy thing."

Before embarking on a round-the-world trip with his new wife in 2010, Nat said: "When our travels are over, my next step will probably be to start a company in the U.S. **It's hard to imagine myself doing something other than founding a startup**, but I don't know exactly what I would do. I'm open to anything."

The trip did not last long: Novell was acquired by Attachmate at the end of April 2011, and the team led by Miguel was disbanded in early May. Nat returned to the United States, and two weeks later, on May 17, the two joined forces again with astonishing speed to found Xamarin, continuing the open source cross-platform SDK Mono that Attachmate had abandoned. In 2016, the company was acquired by Microsoft for approximately $500 million.

After joining Microsoft, Nat still wanted to remain an "entrepreneur." At first he planned to leave after a year or two and spend time on the "side projects" that interested him. During his tenure, in addition to starting the AI Grant project described in detail later in this article, he also founded California YIMBY (Yes In My Back Yard), an organization dedicated to solving California's housing shortage.

**But it didn't take long for him to discover that Microsoft's new CEO, Satya Nadella, was a leader he could learn from: an open-minded manager always reaching for something higher.** In 2017, Nat sent Satya an email proposing that Microsoft acquire GitHub. Even though Nat had only just joined, within a week Satya had given him full authority over what would become the largest developer-related acquisition in history, and one of Microsoft's largest acquisitions at the time.

In 2018, Microsoft acquired GitHub amid widespread skepticism and appointed Nat, the person "developers would be most at ease with," as CEO. On his first day in office, Nat wrote: "I will not ask for your trust, but I am committed to earning your trust." He did not disappoint: he kept the platform independent and neutral and earned a good reputation in the developer community; he created the star product GitHub Copilot, which further expanded GitHub's influence; and he acquired six companies including npm, Semmle, Dependabot, and Pull Panda, driving healthy growth in both revenue and users. In the end, he handed Microsoft an excellent report card.

Among these, the story of how GitHub Copilot was built deserves special mention; it is also the best illustration of the efficient collaboration between Satya and Nat.

On June 11, 2020, OpenAI released GPT-3. Nat was stunned by its capabilities and the demos built on it, and decided to do something with the model immediately, even though, at the time, he didn't know what it could be used for. Fortunately, the far-sighted Satya had already established a partnership with OpenAI, which gave Nat ample room to explore amid the uncertainty.

Soon, Nat found several outstanding developers in the GitHub community. With the question "How do you make a model that often makes mistakes useful?" in mind, the group began exploring two directions: chatbots and code generation. Two months later, they concluded that applying GPT-3 directly to chat was not good enough: such a large model brought too much latency, and it was hard for users to truly enjoy talking to it. Copilot, whose development began in February of the following year, took a different approach. It is like a little assistant sitting on the user's shoulder, solving problems together with the user: it pops up from time to time, helping patch a line of code or even generating an entire function. Like a slot machine that pays out at random, it is not only useful but also a bit addictive. On June 29, 2021, GitHub Copilot was officially released, and it has been loved by millions of programmers ever since.

On November 3, 2021, Nat sent an email to the GitHub team: **"I am moving on to my next adventure: providing support, advice, and investment to founders and developers who are using technology to create the future and seize some big opportunities"**, and thus became a full-time investor.

Nat Friedman's Life Credo

  • As humans, we have the right (perhaps even the moral responsibility) to reshape the universe to our liking
  • Technology, indeed knowledge, makes this possible
  • We should probably try to raise the ceiling, not the floor

**Enthusiasm matters!**
  • It is much easier to do something that excites you
  • Perhaps for this reason, it is easier to do big things than small things
  • Energy is a necessary input for progress

**Moving fast matters**
  • Because we come into contact with reality more often, we learn more per unit of time
  • Moving fast keeps us focused on what matters; there is no time for nonsense
  • "Slow is fake"
  • A week is 2% of a year; time is the denominator

**The Efficient Market Hypothesis is a lie**
  • At best, it is a lossy approximation
  • The best things in life happen where the EMH is wrong
  • In many cases it is more accurate to model the world as 500 people than as 8 billion
  • "Most people are other people"

**We know less than we think**
  • The replication crisis is not an exception
  • Many things we believe are wrong
  • We often don't even ask the right questions

**The cultural ban on micromanagement is harmful**
  • Great individuals should be fully empowered to exercise their judgment
  • The goal is not to avoid mistakes; the goal is to achieve an exceptional level of excellence in some dimension
  • The downsides are worth it

**Small teams are better**
  • Faster decisions, fewer meetings, more fun
  • No need to split up work for political reasons
  • No room for mediocrity (and you can pay more!)
  • Large projects are intellectually more tractable than they appear
  • Many tech companies are 2-10x overstaffed

**Where do we get our dopamine?**
  • The answer predicts our behavior
  • It is better to get dopamine from improving your ideas than from validating them
  • It is fine to get dopamine from "making things happen"

**We can do more than we imagine**
  • We are bound by invisible traditions
  • The laws of physics are the only limit

Genius Daniel Gross

「The most surprising part of the experience was how much it meant for someone to believe in me.」

Daniel Gross is, without question, a prodigy.

Born in 1991, the year Nat Friedman started surfing the Internet, Daniel spent the first eighteen years of his life in Jerusalem, until he graduated from high school. In his hometown, Daniel always considered himself an "outsider." He had few friends and little enthusiasm for life, with one exception: programming. It was the only thing he loved, because in the world of programming he had maximum freedom to do whatever he wanted. **The only limit is your imagination.**

In 2009, after graduating from high school, Daniel was admitted to Bnei David Academy, a well-known pre-military academy in Israel, but he still found neither friends with similar interests nor his own goals in life. Not long after, Daniel's father forwarded him an article about Y Combinator (YC), a Silicon Valley startup program. The article was nominally about fundraising, but more importantly, the 18-year-old realized that YC might be the gathering place of "outsiders" he had been looking for. So, in a desolate Israeli army camp, armed with an old Nokia phone and a bulky laptop, he completed the YC application, and thereby unlocked a "whirlwind life journey."

In 2010, Daniel passed the YC interview and arrived in Silicon Valley *(because "evading" military service violated Israeli law, Daniel has not returned to Israel since)*. He founded a company called Greplin to build a personal search and assistant product, and it was at YC Demo Day that Nat first noticed this unusual young man.

Greplin went on to raise two rounds from top firms such as Sequoia Capital, renamed itself Cue, and was acquired by Apple for about $40 million in 2013. Daniel then became a director at Apple, responsible for machine learning and search. He had just turned 23, and it all happened that quickly.

From being one of the youngest founders backed by YC and Sequoia to being acquired by Apple, Daniel came to believe firmly that the first step to success is to find a community of "outsiders," and the second is to find people who dare to bet on unknown "outsiders," that is, early-stage investors. **So Daniel began exploring early-stage investing himself.** Since 2013, he has invested as an angel in Uber, GitHub, Coinbase, Instacart, Opendoor, Airtable, Figma, Gusto, Notion, Cruise, and other companies: a brilliant track record indeed.

Daniel formally set out on the path of early-stage investing in January 2017: he resigned from his other positions and returned to YC as a partner, not only investing in artificial intelligence but also bringing AI technology into the institution itself. In July of the same year, he joined Nat Friedman to co-lead the AI Grant project; in August 2018, he left YC and founded Pioneer, which aims to help underdogs from all over the world get projects off the ground quickly and to find more "Lost Einsteins" [2].

Self-Reflection Skills, by Daniel Gross

The most important skill we can develop is a natural curiosity about ourselves. Once we develop the habit of constant self-reflection, we develop an appreciation and gratitude for both good and bad experiences. I want to discuss two aspects: interaction with others, and interaction with yourself.

Interactions with other people

**Our goal in life should not be to win any one particular game, but the sum of all games.** To do this, we need to be good at working with others: we can't be too abrasive, or we'll never be invited back onto the team; and we can't be too nervous, or we'll never produce anything.

When we interact with our environment, information is produced. We say things, and people form opinions about what we say. Some people are insensitive to others' reactions, which is a serious mistake: those reactions are valuable "training data" being thrown right in front of us. If we don't retrain our model on input from the crowd, we will never converge toward the truth; we will become the people who are too loud or not loud enough, and the group will stop giving us opportunities to collaborate, because it predicts we won't contribute.

If we want to keep getting invited back to play, we have to be a likable player.

Interaction with yourself

We all have long-term and short-term goals to achieve. There are days when we feel great, our minds are clear, and we find ourselves making good progress, and there are days that are terrible; we all have those. The trick is to treat each day as an opportunity to learn. Whenever we feel we are not productive enough, we should ask ourselves: why? What are we doing wrong? Was it lunch? Did someone say something annoying? Did we get some bad news?

Make sure you learn from your successes, not just your failures. What are the common ingredients of a good day? Good sleep? Good weather? If weather is a factor, should we move somewhere sunnier? And so on.

It is worth noting that environmental factors sometimes have delayed feedback loops. For example, I find that what I eat affects my mood about 96 hours later, so make sure your data-collection window is wide enough.

Another thing that has helped me a lot is meditation. Meditation is like installing a debugger in your brain: it lets us inspect values in real time, and even change them, instead of just watching our code (our mind) crash.

I now "grade" my day every night, trying to dissect what went well and what didn't. I'm fascinated by this practice because I can see my own progress. I hope everyone forces this habit on themselves for a few weeks; once you come to love it, you become addicted to self-improvement.

Over-reflection

An extreme form of self-improvement is what some people call a "chip on your shoulder," and I suffer from it. I am overly self-critical. For example, I ran the New York Marathon, and when I crossed the finish line my first reaction was: "I should have run faster." I always feel I should be doing better.

This is a dangerous fuel. It can push me forward, but left unchecked it makes it hard for me to be happy. If you share this trait, force yourself to celebrate successes. We tend to underinvest in creating happy memories: when something good happens, take time to celebrate; do something weird and funny so you remember it; add a room to your memory palace. Finally, make sure you surround yourself with supportive friends and family (your environment) who help you unwind.

AI Grant: how does it invest?

2017 to 2022 - "Distributed Artificial Intelligence Laboratory"

In March 2016, AlphaGo defeated Lee Sedol, one of the top human Go players. Then 2017 became the famous "year of deep learning frameworks": research, products, startups, and investment in artificial intelligence were more active than ever, and the influential paper "Attention Is All You Need" was published that year as well. Yet the technology of the time was, on the whole, still far from generating real commercial and social value, and basic academic research, for all its apparent variety of directions, was in fact inward-looking and detached from practice.

On April 12, 2017, against this backdrop of enthusiasm and problems in the industry, aigrant.org went live.

In the beginning, Nat ran the entire AI Grant project by himself. His idea was very straightforward: **like Tim, give people "sleeping on a smelly old red sofa," as he once did, the opportunity to realize their dreams.**

Applying to the program was simple: fill out an application form and, if shortlisted, receive a $5,000 grant *(initially limited to five recipients)* to conduct research related to open source AI technology. The whole application took only a few minutes to fill out. This was AI Grant version 1.0.

So why "open source artificial intelligence technology"? At the time, Nat was convinced of two things:

First, open source is the foundation of countless products and ideas, which begin with creators getting free code over the Internet. Before open source became widespread, the first step in creating something new meant building or buying baseline infrastructure; as new open source projects keep emerging, that entry price keeps falling toward zero.

Second, artificial intelligence will be the basis for countless new products, ideas, and companies. From automobiles to medicine to finance to education, AI will drive a huge wave of innovation across every industry. Combined with the first point, open source AI technology will lower the cost of entry, allowing more people, even anyone, to participate (though they still have to pay for GPUs).

As for what counts as artificial intelligence, or as research related to AI technology, Nat has always kept an open mind: anything that feels like AI or contributes to the field qualifies. Just as we cannot precisely define what an "AI-native product" is today, no one could define what "artificial intelligence" was back then.

Regarding the review criteria, Nat specifically mentioned two:

  1. Smart people with interesting ideas that are useful to the world;

  2. Special attention to projects that seem to have absolutely no chance of being funded any other way.

Apparently Tim and Nat aren't the only ones willing to fund young people's futures.

Six days after the AI Grant program was announced, Ann Miura-Ko, founding partner of the early-stage fund Floodgate, who also teaches at Stanford, joined and provided five additional slots. Through the program she hopes to find "prime mover" types: people who start with open source projects and will go on to explore in different directions, and perhaps even start companies.

Then, just three days before the first round's application deadline, technology companies joined in:

  • Microsoft would provide the ten grantees with $1,000 in Azure credits, redeemable for NVIDIA Tesla K80 virtual machines;

  • FloydHub would provide 250 hours of NVIDIA Tesla K80 compute time, Scale would provide $1,000 in human data-labeling credit, and CrowdFlower would provide another $5,000 in data-labeling credit.

This was not just an increase in the monetary value of the grants; the grants themselves became more practical, easier to distribute, and more diverse.

The first AI Grant call was a great success. Nat received nearly 500 applications from 50 countries; more than 20 professional volunteers screened projects with him, and a month later the ten recipients were selected.

In June 2017, Daniel Gross, who had been exploring how to invest in "unknown outsiders," officially joined Nat as a partner on the project and positioned it as a "distributed artificial intelligence laboratory." AI Grant went through another iteration:

  • More infrastructure funding from technology companies. Building on the previous round, Google took over from Microsoft and would provide each grantee with $20,000 in virtual machine credits;

  • A larger network. In addition to the two funders, Andrej Karpathy, then director of AI at Tesla, and researchers from Google formed the AI Grant expert group. Meanwhile, as mentioned earlier, many professional volunteers applied to join Nat's screening team and also became part of the AI Grant network, working alongside the experts to help the funded researchers;

  • A smaller initial cash payout, even though the early-stage fund CRV joined. For an early project, learning to budget resources and "do big things with little money" matters a great deal; $2,500 was all the start-up capital most researchers needed at the time.

Since then, AI Grant's funding model has continued to iterate on this foundation. As with the founders Pioneer backs, the grantees' backgrounds are extremely diverse, from Africa to the United States, from high school students to researchers, although during the AI industry's cold spells the grants were at times handed out only sporadically.

As of 2022, AI Grant had funded more than 50 researchers, 36 of whom received the full cash grant after two rounds of screening. Many went on to found their own companies, two of which have become unicorns: Cohere, a large language model company currently valued at $2.2 billion, and Cresta, an intelligent contact-center company valued at $1.6 billion. Another, Helia, a real-time video data processing company, was acquired by Scale.

Russell Kaplan, founder of Helia, was in the first batch of AI Grant recipients. At the time he was about to graduate from Stanford and was researching the use of natural language to guide reinforcement learning; he built and open sourced a faster-learning deep reinforcement learning agent that beat most other methods on Montezuma's Revenge [3]. After graduation he initially joined Tesla, where he built Tesla's core vision model HydraNet, a large-scale multi-task neural network. Less than two years later, however, he co-founded Helia, a computer vision company for real-time processing of video data, with Ashwin Sreenivas from Palantir and Daniel Berrios from Goldman Sachs; it was sold to Scale at the end of the following year.

Cohere's founders, Aidan Gomez and Ivan Zhang, funded in the second batch that same year, were University of Toronto alumni. Their research project at the time was very hard-core, using generative adversarial networks for password cracking, which made them stand out among the more than 1,000 applicants. With AI Grant's support, the two founded For.ai to pursue the research. Two years later, Aidan, who had just begun a Ph.D. at Oxford (he would graduate in 2023), and Ivan, who had dropped out of the University of Toronto, co-founded Cohere. For.ai is now Cohere For AI, the non-profit research lab inside Cohere *(incidentally, just a day after this year's AI Grant Batch 1 members were announced, Cohere For AI launched its own AI research grant program)*.

Then there is Zayd Enam, a Pakistani immigrant and one of the later grantees, who had already tried an online healthcare startup in his hometown at 16. Not long after receiving the grant, he dropped out of his Stanford Ph.D. and, together with Tim Shi, who had just finished his own Ph.D. and a one-year stint at OpenAI, founded Cresta.

2022 to Present - Shift to an "Early-Stage Venture Fund"

In 2022, the artificial intelligence boom arrived again. Unlike last time, academic research in the field was already rich and varied, while the user experience and product innovation built on it had only just begun.

In an interview, Nat said: "Daniel and I spent a few years playing with GPT models and were blown away by their capabilities. I was very lucky to get to design and release GitHub Copilot. After that I expected a series of new products, because surely more people would go through the same process, discover that GPT-3 can do a lot of incredible things, and then think about whether that capability could be added to different products. But it didn't happen. So by the late summer and early fall of 2022, we started asking ourselves: where is everyone? That's why we restarted AI Grant, to call developers to action."

On August 31, 2022, AI Grant relaunched with a far more "generous" offer: each funded team receives a $250,000 cash investment. It is worth noting that although Nat has partnered with many technology companies over the years, Microsoft Azure cloud computing credits are the one benefit he has always pushed for: **from "open source AI technology" to "AI-native products," from researchers to entrepreneurs, the cost of GPUs is always unavoidable.**

2022 Nat Friedman Promotional Twitter Image

2023 Nat Friedman Promotional Twitter Image

In fact, since 2020, although Nat and Daniel still appear on company cap tables as individual investors, they have quietly raised a venture fund, C2 Investments, totaling about $1.1 billion, plus two smaller funds, CTRY and ND2100, totaling about $142 million, and invest through them in startups related to artificial intelligence and infrastructure. As part of the pair's investment strategy, AI Grant has also formally completed its transformation from a non-profit program into a venture investment vehicle, committed to investing in AI-native products at an even earlier stage.

As early-stage investors focused on the AI vertical, Nat and Daniel take a pragmatic approach and have put substantial effort into infrastructure and support:

  • In early 2023, Nat built nat.dev, a platform that aggregates almost all of the common language models on the market, making it easy to try out and compare them;

  • In June 2023, Nat and Daniel acquired 2,512 NVIDIA H100 GPUs (worth roughly $100 million, about half the size of NVIDIA's in-house supercomputer) to build the Andromeda Cluster and opened it to the startups they invest in, meaning these small startups gain access to computing resources that normally only well-funded larger companies can afford.

AI-Native: How to Invest, and in What?

**A basic question: how do Nat and Daniel define, and screen for, AI-native products?**

As an important reference, the answer given by the AI Grant official website is as follows: "Any product that utilizes artificial intelligence models in a useful or interesting way. In particular, we are looking for technical and pragmatic founders who can build great products. If you’re excited about something other people enjoy using, and understand that building something new is only 1% of the idea and 99% of the iteration, then we want to support you.”

"Any product that leverages an AI model in a useful or interesting way" — again, the duo kept it open. In fact, although there is no clear scope, from their interviews, project investment, members of AI Grant Batch 1 and even the "Vesuvius Challenge" previously launched, their preferences for artificial intelligence products and even their attitudes towards technology use can be seen One spot.

C2 Investments' Portfolio Companies

2017|Retool 🦄️

  • Location - San Francisco, USA

  • Direction - low-code/no-code tools for building internal enterprise software

  • Founder - David Hsu, BSc in Philosophy and Computing from Oxford University in 2017

  • Time of investment - 2017 (followed on in 5 consecutive rounds through 2022)

  • Other Investors - Patrick Collison, John Collison, Elad Gil, YC, Sequoia, etc.

2022|Keen

  • Location - San Francisco, USA

  • Direction - artificial general intelligence (AGI)

  • Founder - John Carmack, who co-founded id Software in 1990 and was lead programmer on Commander Keen, Wolfenstein 3D, Doom, Quake and their sequels; joined Oculus as CTO in 2013

  • Time of investment - 2022

  • Other Investors - Patrick Collison, Tobi Lutke, Sequoia, Capital Factory

2022|ElevenLabs

  • Location - London, UK

  • Direction - voice cloning and generation

  • Founders - Piotr Dabkowski, who earned a bachelor's degree in engineering from Oxford in 2016 and a master's degree in computer science from Cambridge in 2017, and was a software engineer at Google Zurich before leaving to start the company in 2022; Mati Staniszewski, a graduate of the mathematics department at Imperial College London, who was a deployment strategist at Palantir before leaving to start the company in 2022

  • Time of investment - 2023

  • Other investors - a16z, SVA, Guillermo Rauch, etc.

2023|Lexica

  • Location - San Francisco, USA

  • Direction - image search and generation tools

  • Founder - Sharif Shameem, who graduated from the University of Maryland in 2019 and founded the P2P cloud gaming company Vectordash that same year; in 2022 he founded Debuild, a language-model-driven low-code tools company

  • Time of investment - 2022

  • Other investors - AI Grant

26 members of AI Grant Batch 1

Batch 1 companies are diverse not only in product direction but also in founder background, ranging from recent college graduates (Flair, WOMBO) to experienced serial entrepreneurs (Replicate, Chroma). Most of these outstanding products have been covered in previous newsletters; to keep this article from running too long, we will not describe each company in detail here, but simply list a one-line introduction and URL:

Infrastructure

Replicate - Cloud infrastructure for machine learning models

🔗

Chroma - Open source embedding database (more colloquially, programmable memory)

🔗

Application layer

🔠 Text

Perplexity - Search Tool

🔗

ValueBase - Asset Valuation Modeling Tool for Government

🔗

Sameday - Appointment Scheduling Tool for Marketers

🔗

Ghostwrite - Automated email writing tool

🔗

Samaya AI - Knowledge Discovery Platform for Financial Services

🔗

Forefront - Enterprise Chatbot

🔗

Dust - Assistant for teamwork

🔗

Circle Labs - AI characters for Discord group chats

🔗 (The website is extremely crude, but I really like it!!)

🎨 Vision

Lexica.art - Image search and generation tool

🔗

Recraft - vector graphics and 3D model generation tool

🔗

Flair - Tool for branding content design (mainly product and model graphics)

🔗

Poly - Texture Generation Tool

🔗

WOMBO - Lip Sync video generation tool for consumers

🔗

Sieve - Video processing, understanding and search API cloud platform

🔗

Vizcom - Engineering/design drawing generation tool

🔗

Secret Weapons - Video tools for the film industry

🔗

Pixelcut - Product Photo Generator

🔗

AniML - NeRF-based product video generation tool

🔗

💻 Code

Cursor - code editing tool

🔗

Rowy - Low-code backend

🔗

🎙️ Voice

Play.ht - Speech generation and cloning

🔗

♾️ Multimodal and more

Animato (Call Annie) - video chat with virtual characters

🔗

Birch - Automation of call center operations in high-compliance industries

🔗

Minion.ai - Automated browser assistant *(product not yet released)*

🔗

Vesuvius Challenge - Artificial Intelligence for Human Civilization

If the venture funds and AI Grant are Nat and Daniel's investment in using artificial intelligence to advance the business world, then the Vesuvius Challenge, which the two launched in March 2023 together with Brent Seales, a professor of computer science at the University of Kentucky and a co-founder of scrollprize.org, is their exploration of using artificial intelligence to advance human civilization.

The challenge asks entrants to read two unopened scrolls (Herculaneum papyri) that were carbonized and buried under 20 meters of earth and volcanic ash by the eruption of Mount Vesuvius in AD 79, an undoubtedly daunting task. The competition builds on work already done by its third organizer, Brent Seales: back in 2015, he and his team used X-ray tomography and computer vision to "read" the carbonized En-Gedi scroll found near the Dead Sea in Israel, revealing the biblical text it contains without ever opening it. Reading the Herculaneum papyri, however, is harder: unlike the denser ink used in the En-Gedi scroll, Herculaneum ink is carbon-based, and so is the carbonized papyrus, so the two show no contrast under X-rays.

At the same time, the task's significance for the study of human history is enormous: if we could fully unroll and read the 1,814 existing scrolls and fragments, the amount of ancient literature in humanity's possession might more than double. Many of them, however, have been damaged by earlier ill-advised attempts at unrolling; aside from some Greek philosophical scrolls that an Italian monk spent decades carefully unrolling and piecing together, more than 600 scrolls remain unopened.

According to the competition's official website, the $1 million grand prize will go to the first team to make the text of one of the fully scanned scrolls readable by 11:59 p.m. on December 31, 2023. The recovered text must be presented as clear, legible images and be accompanied by a detailed technical description showing that the method is reproducible and feasible.

The competition also includes an additional requirement: guard against hallucinations. If there is any risk that the team's model hallucinated its results, the submission must explain how that risk was mitigated in practice and why the submitters are confident the results are real.

**This is undoubtedly a great opportunity to use new technology to unlock humanity's ancient secrets.** Beyond the many contestants, the growing donations show the enthusiasm of all kinds of people for using new technology to advance human civilization: within days of the challenge's launch, nearly 20 entrepreneurs, investors, and anonymous donors, including Stripe founders the Collison brothers, Shopify founder Tobi Lutke, and WordPress founder Matt Mullenweg, joined the donor list, and the prize pool quadrupled 👇

It is also worth noting that even in a competition, Nat and Daniel champion the open source spirit rooted in their hearts. "All the organizers of the Vesuvius Challenge strongly believe in open source and incremental progress. We want to encourage building in the open for the benefit of the community as a whole, something that competitions often inhibit," the official website reads. The competition accordingly set up three additional $2,000 open source prizes.

About the development cycle of artificial intelligence products

In a March 2023 interview with Ben Thompson, Nat offered this view of the development cycle of AI-native products: **because the network infrastructure is already mature, AI products will spread twice as fast as the previous generation of Internet products, but we still need time to figure out what truly AI-native products look like, rather than merely improving existing workflows and software.** Some of his more interesting specific points follow *(I don't think anyone can predict the future two or more years out; these are for reference only)*:

  • Even if researchers stopped here and never iterated or added capabilities again, we would still need five to ten years to digest the capabilities of GPT-4 and other advanced models and turn them into products. There are so many variations, workflows, and user experiences to invent, reinvent, or permute, and we are just scratching the surface, mostly bolting these capabilities onto existing products.

  • The operating system needs to be rebuilt around AI capabilities. Different startups have shown that AI can do different, almost surreal things, and the entire computing platform could be rebuilt over the next ten years. The current state of the field is that researchers are out in front, a great deal of digestion still has to happen at the commercial level, and that process is hard to accelerate.

  • AI capabilities will not stop here; they will keep developing, and the trend of the past two years will likely continue in large steps. So even if we do arrive at a native product design for 2023's AI capabilities, by 2024 we may find ourselves dealing with completely different capabilities and tools. This is a whole new technology wave, and products will need time to digest it.

From this, he raised a question: if the ground (the infrastructure) beneath our feet keeps shifting rapidly, where should we place our bets?

His answer: **to truly excel in artificial intelligence today, you must understand it more deeply.** Unlike a decade or more ago, starting a company is now commonplace, and the selection effect in Silicon Valley is weakening. With more people in the pool, choosing a good direction is harder, and in a direction as popular as artificial intelligence, building a company is harder still for any founder.

But don't be too pessimistic: **when did we really recognize that the Internet had become an industry? After the bubble burst.**

A closing note: ZhenFund AI Grant

As mentioned earlier, Nat Friedman and Daniel Gross's "investment experiment" has given me a great deal of inspiration and motivation. So, starting August 1st, together with my boss Yusen, my colleagues in operations and marketing, and our partners at AWS, I have launched our own AI Grant. Although the support we offer is still far from matching what the two pioneers provide, we hope to grow together with the Chinese developer community. An introduction and sign-up instructions are at the end of the article.

Einstein was a patent clerk in Bern

With ideas many thought were crazy

Outsiders often have the weirdest and best ideas

Our goal is to find and fund them.

[1] IRC stands for Internet Relay Chat, an application-layer protocol used mainly for group chat, though it can also be used for one-on-one chat.

[2] The term "Lost Einstein" comes from Raj Chetty's research: despite scoring similarly on aptitude tests in early childhood, children from high-income (top 1%) families were ten times as likely to become inventors as children from below-median-income families. Chetty uses "Lost Einsteins" to refer to low-income talents who could have done great things had they been given the right opportunities.

[3] Montezuma's Revenge is an Atari game that represents a broad class of challenging problems known as "hard-exploration problems": environments with sparse or deceptive feedback, in which an AI agent must learn complex tasks and clear levels from very little reward signal. It is therefore regarded as a benchmark challenge for reinforcement learning.
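To make the footnote's notion of "sparse feedback" concrete, here is a minimal illustrative sketch (not from the article, and far simpler than an Atari game; the corridor length and step limit are arbitrary assumptions): the agent receives a reward only if an episode ends in success, so purely random exploration almost never produces any learning signal.

```python
# Toy illustration of a sparse-reward environment (hypothetical example).
# The agent walks along a corridor and is rewarded only if it reaches the goal.
import random

GOAL = 20          # assumed corridor length
MAX_STEPS = 50     # assumed episode length limit


def run_episode(policy):
    """Run one episode and return its total reward (0 or 1)."""
    position = 0
    for _ in range(MAX_STEPS):
        action = policy()                 # -1 = step left, +1 = step right
        position = max(0, position + action)
        if position >= GOAL:
            return 1.0                    # the only reward in the whole episode
    return 0.0                            # otherwise the agent gets no signal at all


def random_policy():
    return random.choice([-1, +1])


if __name__ == "__main__":
    episodes = 10_000
    successes = sum(run_episode(random_policy) for _ in range(episodes))
    # With purely random exploration the success rate is tiny, which is why
    # hard-exploration games like Montezuma's Revenge defeat naive RL agents.
    print(f"random policy reached the goal in {successes:.0f}/{episodes} episodes")
```

Standard remedies such as reward shaping or curiosity-driven exploration exist precisely to cope with this lack of signal.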
