Friday, November 22, 2024

Adobe Develops SlimLM That Can Process Documents Locally on Devices Without Internet Connectivity

Adobe researchers have published a paper detailing a new artificial intelligence (AI) model capable of processing documents locally on a device. Published last week, the paper explains that the researchers experimented with existing large language models (LLMs) and small language models (SLMs) to find how far an AI model can be shrunk while keeping its processing capability and inference speed high. As a result of these experiments, they developed an AI model, dubbed SlimLM, that can run entirely on a smartphone and process documents.

Adobe Researchers Develop SlimLM

AI-powered document processing, which lets a chatbot answer user queries about a document's content, is an important use case of generative AI. Many companies, including Adobe, have tapped into this application and released tools that offer the functionality. However, these tools share one issue: the AI processing takes place in the cloud. Server-side processing raises data privacy concerns and makes working with documents that contain sensitive information risky.

The risk mainly stems from fears that the company offering the solution might use the data to train its AI, or that a data breach could leak the sensitive information. As a solution, Adobe researchers published a paper on the pre-print server arXiv detailing a new AI model that carries out document processing entirely on the device.

The smallest variant of SlimLM contains just 125 million parameters, which makes it feasible to integrate into a smartphone's operating system. The researchers claim that it can operate locally, without needing Internet connectivity. As a result, users can process even the most sensitive documents without concern, as the data never leaves the device.
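For illustration, querying a document with a model of that size entirely on-device can be sketched in a few lines of Python. The model name below is a placeholder, since the article does not point to a released SlimLM checkpoint; any comparably small causal language model could stand in the same way.

```python
# Illustrative sketch of on-device document Q&A with a small causal LM.
# "slimlm-125m" is a placeholder path, not a real released checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_PATH = "slimlm-125m"  # assumed local checkpoint of a ~125M-parameter model

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
model = AutoModelForCausalLM.from_pretrained(MODEL_PATH)

def answer_locally(document: str, question: str, max_new_tokens: int = 128) -> str:
    """Generate an answer about the document; no network call is involved."""
    prompt = f"Document:\n{document}\n\nQuestion: {question}\nAnswer:"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens and return only the newly generated answer.
    answer_ids = outputs[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(answer_ids, skip_special_tokens=True)

print(answer_locally("Invoice total: $4,210, payable by 30 November.",
                     "When is the payment due?"))
```

Because both the model weights and the document stay in local memory, nothing is transmitted off the device, which is the property the researchers emphasise.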

In the paper, the researchers highlighted that they conducted several experiments on a Samsung Galaxy S24 to find the right balance between model size, inference speed, and document-processing performance. After settling on a configuration, the team pre-trained the model on the SlimPajama-627B dataset and fine-tuned it using DocAssist, a dataset specialised for document-processing tasks.
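As a rough sketch of that two-stage recipe, the fine-tuning step on document-assistance examples could look like the following; the checkpoint name, data file, and field names are illustrative assumptions, not artefacts released with the paper.

```python
# Rough sketch of the fine-tuning stage described above. All names here
# (checkpoint, data file, JSON fields) are illustrative assumptions.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

BASE = "slimlm-125m-pretrained"  # assumed checkpoint after general pre-training

tokenizer = AutoTokenizer.from_pretrained(BASE)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token  # needed for padded batches
model = AutoModelForCausalLM.from_pretrained(BASE)

# Placeholder file standing in for DocAssist-style (document, instruction, response) records.
dataset = load_dataset("json", data_files="docassist_style.jsonl")["train"]

def to_features(example):
    text = f"{example['document']}\n{example['instruction']}\n{example['response']}"
    return tokenizer(text, truncation=True, max_length=1024)

tokenized = dataset.map(to_features, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="slimlm-docassist-ft",
                           per_device_train_batch_size=8, num_train_epochs=1),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```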

Notably, arXiv is a pre-print repository where papers are published without peer review, so the validity of the claims made in the research paper has not been independently verified. If they hold up, however, the AI model could eventually ship with Adobe's platforms.

