Articles with "transformer models" as a keyword



Transformer Models in Healthcare: A Survey and Thematic Analysis of Potentials, Shortcomings and Risks

Published in 2024 at "Journal of Medical Systems"

DOI: 10.1007/s10916-024-02043-5

Abstract: Large Language Models (LLMs) such as the Generative Pre-trained Transformer (GPT) and Bidirectional Encoder Representations from Transformers (BERT), which use transformer model architectures, have significantly advanced artificial intelligence and natural language processing. Recognized for their ability…

Keywords: transformer models; thematic analysis; language; shortcomings risks

Can We Quickly Learn to "Translate" Bioactive Molecules with Transformer Models?

Published in 2023 at "Journal of chemical information and modeling"

DOI: 10.1021/acs.jcim.2c01618

Abstract: Meaningful exploration of the chemical space of druglike molecules in drug design is a highly challenging task due to a combinatorial explosion of possible modifications of molecules. In this work, we address this problem with…

Keywords: learn translate; translate bioactive; transformer models; molecules transformer

Applying transformer models to psychological time-series data: A step-by-step tutorial with an empirical illustration of depression trajectories.

Published in 2025 at "Psychological assessment"

DOI: 10.1037/pas0001416

Abstract: Transformer models have emerged as powerful tools for analyzing time-series data, yet their application in clinical psychology remains underexplored. With the increasing availability of high-frequency psychological data, these models offer new opportunities for time-series analysis,…

Keywords: time series; step step; transformer models; series data

Transformer models as predication machines

Published in 2024 at "Discourse Processes"

DOI: 10.1080/0163853x.2024.2362038

Abstract: Predication is the process by which the meaning of words is altered as a consequence of the contexts in which they appear. Kintsch provides an algorithm to capture this process. The model is based…

Keywords: predication; predication machines; transformer models; language

Comparison of CNNs and Transformer Models in Diagnosing Bone Metastases in Bone Scans Using Grad-CAM

Published in 2025 at "Clinical Nuclear Medicine"

DOI: 10.1097/rlu.0000000000005898

Abstract: Purpose: Convolutional neural networks (CNNs) have been studied for detecting bone metastases on bone scans; however, the application of ConvNeXt and transformer models has not yet been explored. This study aims to evaluate the performance…

Keywords: transformer models; bone scans; bone metastases; transformer

A Comprehensive Approach Toward Wheat Leaf Disease Identification Leveraging Transformer Models and Federated Learning

Published in 2024 at "IEEE Access"

DOI: 10.1109/access.2024.3438544

Abstract: Wheat is one of the most extensively cultivated crops worldwide that contributes significantly to global food caloric and protein production and is grown on millions of hectares yearly. However, diseases like brown rust, septoria, yellow…

Keywords: disease identification; transformer models; wheat; federated learning

GazPNE2: A General Place Name Extractor for Microblogs Fusing Gazetteers and Pretrained Transformer Models

Published in 2022 at "IEEE Internet of Things Journal"

DOI: 10.1109/jiot.2022.3150967

Abstract: The concept of “human as sensors” defines a new sensing model, in which humans act as sensors by contributing their observations, perceptions, and sensations. This is crucial for the development of Social Internet of Things,…

Keywords: gazpne2 general; transformer models; pretrained transformer; place names

RoPIM: A Processing-in-Memory Architecture for Accelerating Rotary Positional Embedding in Transformer Models

Published in 2025 at "IEEE Computer Architecture Letters"

DOI: 10.1109/lca.2025.3535470

Abstract: The emergence of attention-based Transformer models, such as GPT, BERT, and LLaMA, has revolutionized Natural Language Processing (NLP) by significantly improving performance across a wide range of applications. A critical factor driving these improvements is…

Keywords: architecture; transformer models; ropim processing; positional embedding

Understanding and Characterizing Communication Characteristics for Distributed Transformer Models

Published in 2025 at "IEEE Micro"

DOI: 10.1109/mm.2025.3531323

Abstract: The transformer architecture has revolutionized many applications, such as large language models. This progress has been largely enabled by distributed training, yet communication remains a significant bottleneck. This article examines the communication behavior of transformer…

Keywords: understanding characterizing; transformer models; language; characterizing communication

CoFormer: Collaborating With Heterogeneous Edge Devices for Scalable Transformer Inference

Published in 2025 at "IEEE Transactions on Computers"

DOI: 10.1109/tc.2025.3604473

Abstract: The impressive performance of transformer models has sparked the deployment of intelligent applications on resource-constrained edge devices. However, ensuring high-quality service for real-time edge systems is a significant challenge due to the considerable computational demands…

Keywords: edge devices; heterogeneous edge; inference; transformer models

Advancing Arabic Dialect Detection with Hybrid Stacked Transformer Models

Published in 2025 at "Frontiers in Human Neuroscience"

DOI: 10.3389/fnhum.2025.1498297

Abstract: The rapid expansion of dialectally distinctive Arabic material on social media and the internet highlights how important accurate dialect categorization is for a variety of Natural Language Processing (NLP) applications. The improvement…

Keywords: stacking model; nlp applications; transformer models; performance