Large Language Model for Automobiles

Preprint | 10.55415/deep-2024-0004.v1
Fei Ding#*
Independent Researcher

# contributed equally to this work, * Corresponding author


Abstract

The rapid expansion in the scale of large language models has significantly raised hardware requirements, making it exceedingly challenging to deploy them on mobile devices such as smartphones and tablets.

To enable deployment in cars, we trained a 7-billion-parameter automotive language model that outperforms GPT-3.5 in the automotive domain, surpassing all models in areas such as automotive maintenance and navigation queries.
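The abstract does not describe the training setup. As a rough, hypothetical sketch of what supervised fine-tuning of a 7-billion-parameter base model on automotive-domain instruction data might look like, the snippet below uses the Hugging Face Trainer API; the base model name and the data file automotive_instructions.jsonl are placeholders, not details taken from the paper.

# Hypothetical sketch: supervised fine-tuning of a 7B base model on an
# automotive-domain instruction dataset. Model name and data file are
# placeholders; the preprint does not specify them.
import torch
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

BASE_MODEL = "meta-llama/Llama-2-7b-hf"      # placeholder 7B base model
DATA_FILE = "automotive_instructions.jsonl"  # hypothetical domain corpus

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
tokenizer.pad_token = tokenizer.eos_token

model = AutoModelForCausalLM.from_pretrained(
    BASE_MODEL, torch_dtype=torch.bfloat16
)

def format_and_tokenize(example):
    # Concatenate question and answer into a single training sequence.
    text = (
        f"### Question:\n{example['question']}\n"
        f"### Answer:\n{example['answer']}"
    )
    tokens = tokenizer(
        text, truncation=True, max_length=1024, padding="max_length"
    )
    # For causal LM training, labels are the input tokens themselves.
    tokens["labels"] = tokens["input_ids"].copy()
    return tokens

dataset = load_dataset("json", data_files=DATA_FILE, split="train")
dataset = dataset.map(format_and_tokenize, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="automobile-7b",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=16,
        num_train_epochs=3,
        learning_rate=2e-5,
        bf16=True,
        logging_steps=10,
    ),
    train_dataset=dataset,
)
trainer.train()

Any actual in-car deployment would also need to address the hardware constraints the abstract mentions, for example through quantization or distillation, which are not detailed here.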
Version History
  • 01 May 2024 18:30 Version 1