    Tech Empire Solutions
    Why Elon Musk’s artificial intelligence company “open source” Grok matters, and why it doesn’t

By techempire

Elon Musk’s xAI released the Grok large language model in “open source” form over the weekend. The billionaire is clearly hoping to put his company on a collision course with rival OpenAI, which, despite its name, isn’t particularly open. But does releasing the code for something like Grok really contribute to the AI development community? Yes and no.

Grok is a chatbot trained by xAI to fill the same vaguely defined role as something like ChatGPT or Claude: you ask it, and it answers. This LLM, however, was given a sassy tone and extra access to Twitter data as a way of differentiating itself from the rest.

    As usual, these systems are nearly impossible to evaluate, but the general consensus seems to be that it’s competitive with previous-generation mid-range models like GPT-3.5. (Whether you think this is impressive given the short development time, or disappointing given the budget and the hype surrounding xAI, is entirely up to you.)

    Regardless, Grok is a modern and practical LLM with significant scale and capabilities, and the more the development community knows about such things, the better. The problem is that the definition of “open” involves more than just allowing one company (or billionaire) to claim the moral high ground.

    This is not the first time that the terms “open” and “open source” have been questioned or abused in the field of artificial intelligence. We’re not just talking about technical arguments, like choosing a usage license that’s not as open as others (Grok is Apache 2.0, in case you were wondering).

First of all, “open source” means something different for artificial intelligence models than it does for other software.

If you’re making a word processor, for example, it’s relatively simple to open source it: release all the code publicly and let the community suggest improvements or make their own versions. Part of what makes open source valuable is that every aspect of the application is original or attributed to its original creator; this transparency and insistence on proper attribution is core to the concept of openness, not just a byproduct of it.

For artificial intelligence, this is arguably impossible, because machine learning models are created through a largely unknowable process in which vast amounts of training data are distilled into a complex statistical representation whose structure no human really guided or even fully understands. This process cannot be inspected, audited, and improved the way traditional code can, so while such models are still of great value in some sense, they can never be truly open. (The standards community hasn’t even defined what “open” means in this context, but is actively discussing it.)

But that hasn’t stopped AI developers and companies from designing their models to be, or describing them as, “open,” a term that has lost much of its meaning in this context. Some call their model “open” if there is a public-facing interface or API. Some call it “open” if they publish a paper describing the development process.

Arguably, the closest an AI model comes to being “open source” is when its developers release its weights: the exact parameters of the neural network’s myriad nodes, which perform vector math operations in a precise order to complete the pattern started by a user’s input. But even “open weight” models like LLaMa-2 exclude other important data, such as the training data set and process, that would be necessary to recreate the model from scratch. (Some projects, of course, go further.)
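To make concrete what “releasing the weights” actually means, here is a toy sketch (the layer sizes, random initialization, and function names are illustrative inventions, nothing from Grok itself): a model is, at bottom, just arrays of numbers that fixed vector math runs through, and publishing those arrays is what an open-weight release amounts to.

```python
import numpy as np

# Toy illustration: a trained "model" is ultimately just weight matrices.
# An open-weight release publishes arrays like W1 and W2, but not the
# training data or pipeline that produced their values.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 8))   # layer-1 weights (stand-in values)
W2 = rng.standard_normal((8, 3))   # layer-2 weights (stand-in values)

def forward(x):
    """One forward pass: vector math in a fixed order, determined
    entirely by the weights."""
    h = np.maximum(x @ W1, 0)      # linear transform + ReLU
    return h @ W2

output = forward(np.ones(4))       # a 3-dimensional output vector
```

With only the weights in hand, anyone can run or fine-tune this computation, but reconstructing how those particular numbers came to be is another matter entirely.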

All of this is before even mentioning the fact that creating or replicating these models takes millions of dollars in computing and engineering resources, effectively limiting who can create and replicate them to companies with considerable means.

So where does xAI’s release of Grok fall on this spectrum?

As an open weight model, it’s available for anyone to download, use, modify, fine-tune, or refine. That’s great! It appears to be among the largest models anyone can freely access this way, in terms of parameters (314 billion), and curious engineers who want to test how it performs with various modifications have a lot to work with.

    However, the model’s size also has serious drawbacks. You’ll need hundreds of gigabytes of fast RAM to use it in this raw form. If you don’t already own a dozen Nvidia H100s in a six-figure AI inference rig, don’t bother clicking that download link.
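To put that in perspective, here is a back-of-the-envelope estimate (assuming 16-bit weights, which may not match the actual checkpoint format) of the memory needed just to hold the parameters:

```python
# Rough estimate of RAM needed just to hold a 314B-parameter model's
# weights. Assumes 2 bytes per parameter (fp16/bf16); the actual
# checkpoint precision may differ, and inference needs extra headroom
# for activations and caches on top of this.
PARAMS = 314e9           # 314 billion parameters
BYTES_PER_PARAM = 2      # fp16/bf16

weights_gb = PARAMS * BYTES_PER_PARAM / 1e9
print(f"~{weights_gb:.0f} GB just for the weights")  # ~628 GB
```

Even at this optimistic precision, the figure lands well beyond consumer hardware, which is the point of the paragraph above.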

While Grok is arguably competitive with some other modern models, it’s also significantly larger than they are, meaning it needs more resources to accomplish the same things. There’s always a hierarchy of scale, efficiency, and other metrics, so it’s still valuable, but this is more raw material than final product. It’s also not clear whether this is the latest and best version of Grok, like the clearly tuned version some have access to via X.

Overall, releasing these weights is a good thing, but it’s not as game-changing as some hoped.

It’s also hard not to wonder why Musk would do this. Is his nascent AI company really dedicated to open source development? Or is this just a poke in the eye for OpenAI, with which Musk is currently pursuing a billionaire-level beef?

If they are truly committed to open source development, this will be the first of many releases, and they will hopefully take community feedback into account, release other crucial information, characterize the training data process, and further explain their approach. If they aren’t, and this was done only so Musk can point to it in online arguments, it’s still valuable; it just won’t be something anyone in the AI world relies on or pays attention to in a few months, once the next model comes along.
