Modeling and Design Enablement for Future Computing

Chien-Ting Tung

EECS Department
University of California, Berkeley
Technical Report No. UCB/EECS-2025-13
April 24, 2025

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2025/EECS-2025-13.pdf

With the continued scaling of integrated circuits and the increasing number of transistors on a chip, the future IC industry faces tremendous power consumption and extreme design complexity. To address these two challenges, I studied new devices for low-power computing and new machine-learning-based computer-aided design techniques to accelerate IC simulation. My dissertation discusses the physics and compact modeling of new-generation memory devices for in-memory computing. Furthermore, a machine-learning compact modeling framework is proposed that can accelerate IC simulation by more than 5 times.

In Chapter 2, I demonstrate my work on polycrystalline ferroelectric capacitor modeling. This work studies the switching dynamics of multi-grain/domain ferroelectrics by calculating the accumulated switched area in a ferroelectric capacitor. The model successfully captures the dynamics of ferroelectric materials and can be used in circuit simulation for memory design. Chapter 3 extends this model to the ferroelectric tunnel junction (FTJ) with a newly developed tunneling current model. The FTJ is a memristor whose resistance depends on the stored polarization. The developed model accurately fits experimental FTJ data. Chapter 4 discusses a compact model of the ferroelectric field-effect transistor (FeFET). The FeFET uses a ferroelectric material as the gate stack, providing transistor and memory functions at the same time. I developed a model that includes minor-loop switching, steep switching, and the inverse memory window in FeFETs, with excellent fits to real device data. Chapter 5 discusses the physics and modeling of multi-grain/domain antiferroelectric materials. Chapter 6 studies the switching dynamics of ferroelectric materials at the nanoscale, where switching becomes discrete.
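As one cartoon of the accumulated-switched-area idea, the sketch below assigns each grain a Kolmogorov-Avrami-Ishibashi (KAI)-style switching law with its own characteristic time and sums the area-weighted switched fractions. The grain count, exponent, and switching-time distribution are invented for illustration; this is not the dissertation's calibrated model.

```python
import numpy as np

# Minimal KAI-style sketch of multi-grain ferroelectric switching
# (illustrative simplification, not the dissertation's model): grain i
# switches as P_i(t) = 1 - exp(-(t/tau_i)^beta), and the total switched
# polarization is the area-weighted accumulation over all grains.
rng = np.random.default_rng(0)

n_grains = 200
beta = 2.0                                     # KAI exponent (assumed)
tau = 10.0 ** rng.normal(-6.0, 1.0, n_grains)  # spread of switching times (s)
area = rng.dirichlet(np.ones(n_grains))        # random grain areas, sum to 1

def switched_fraction(t):
    """Accumulated switched area fraction at time t (goes from 0 to 1)."""
    return float(np.sum(area * (1.0 - np.exp(-(t / tau) ** beta))))

for t in (1e-8, 1e-6, 1e-4):
    print(f"t = {t:.0e} s  ->  switched fraction = {switched_fraction(t):.3f}")
```

Because each grain contributes monotonically in time, the accumulated fraction rises smoothly from 0 toward 1, with the spread of grain switching times setting the width of the transient.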

In addition to ferroelectric devices, other emerging memories are also strong candidates for future computing. In Chapter 7, I present a compact model of resistive random-access memory (RRAM) that unifies the different switching mechanisms of various types of RRAM. The model includes self-heating and disturbance effects in RRAMs and is tested with multi-level memory cell simulations. Magnetic random-access memory (MRAM) is another device I have studied. In Chapter 8, an MRAM compact model is demonstrated using a 1D Landau–Lifshitz–Gilbert (LLG) equation, which preserves the physics of the full LLG equation while providing a 2.5-times speedup.
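To picture what LLG dynamics look like, here is a generic single-domain (macrospin) explicit-Euler integrator of the Landau–Lifshitz–Gilbert equation; the field, damping value, and integration scheme are illustrative assumptions, not the 1D model developed in Chapter 8.

```python
import numpy as np

# Macrospin Landau-Lifshitz-Gilbert (LLG) sketch with illustrative parameters;
# a generic single-domain integrator, not the Chapter 8 compact model:
#   dm/dt = -gamma/(1+alpha^2) * [ m x H  +  alpha * m x (m x H) ]
gamma = 1.76e11                   # gyromagnetic ratio (rad s^-1 T^-1)
alpha = 0.1                       # Gilbert damping (assumed value)
H = np.array([0.0, 0.0, 0.1])     # effective field along +z (tesla)

m = np.array([1.0, 0.0, 0.01])    # unit magnetization, starting almost in-plane
m /= np.linalg.norm(m)

dt = 1e-13                        # explicit Euler time step (s)
for _ in range(50_000):           # 5 ns of dynamics
    mxH = np.cross(m, H)
    dm = -gamma / (1.0 + alpha**2) * (mxH + alpha * np.cross(m, mxH))
    m = m + dt * dm
    m /= np.linalg.norm(m)        # renormalize so |m| = 1 is preserved

print("final m =", m)             # damping relaxes m toward the field (+z)
```

The first term produces precession of m about H; the damping term spirals m toward the field direction, which is the behavior a compact MRAM model must reproduce at much lower cost.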

From Chapter 9 to Chapter 11, I discuss my BSIM-NN framework, a neural-network-based compact model for advanced transistors. The model uses neural networks to replace the model equations of conventional transistor models. As a result, it provides a generic framework for transistor modeling, from FinFETs and gate-all-around (GAA) FETs to emerging transistors such as the negative-capacitance FET (NCFET), going beyond what traditional models can cover. Moreover, the compactness of this model can accelerate IC simulation speed by 5 times compared with the industry-standard compact model. The model includes all terminal currents and charges of advanced FETs, as well as the non-quasi-static effect and the self-heating effect.
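The idea of swapping closed-form I-V equations for a learned mapping can be sketched with a toy multilayer perceptron fitted to synthetic square-law transistor data. The architecture, training loop, and data below are invented for illustration and are not the actual BSIM-NN implementation.

```python
import numpy as np

# Toy neural-network compact model (illustrative only, not BSIM-NN):
# fit a tiny MLP mapping (Vgs, Vds) -> Ids on synthetic square-law data.
rng = np.random.default_rng(1)

def ids_target(vgs, vds, vt=0.4, k=1e-3):
    """Synthetic long-channel square-law 'device' used as training data."""
    vov = np.maximum(vgs - vt, 0.0)
    return k * np.where(vds < vov, (vov - 0.5 * vds) * vds, 0.5 * vov**2)

# training grid over the bias plane
vg, vd = np.meshgrid(np.linspace(0, 1, 20), np.linspace(0, 1, 20))
X = np.stack([vg.ravel(), vd.ravel()], axis=1)
y = ids_target(X[:, 0], X[:, 1])[:, None] * 1e3   # rescale currents to ~O(0.1)

# one tanh hidden layer, trained by full-batch gradient descent
W1 = rng.normal(0, 1, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 1, (16, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)
    err = (h @ W2 + b2) - y
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    gh = (err @ W2.T) * (1 - h**2)
    gW1 = X.T @ gh / len(X); gb1 = gh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean(((np.tanh(X @ W1 + b1) @ W2 + b2) - y) ** 2))
print(f"training MSE: {mse:.2e}")
```

In a real compact-model setting the inputs would also include geometry and temperature, and the network output must stay smooth and differentiable so the circuit simulator's Newton iterations converge.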

Chapter 12 presents NeuroSpice, a new way to perform circuit simulation with physics-informed neural networks, offering an alternative solution for IC simulation.
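As a cartoon of the physics-informed idea (not NeuroSpice itself), the sketch below recovers an RC-discharge waveform by minimizing the squared ODE residual at collocation points; using fixed random tanh features with a trainable linear output layer turns the minimization into plain linear least squares.

```python
import numpy as np

# Physics-informed residual sketch (illustrative, not NeuroSpice): solve
#   du/dt + u/tau = 0,  u(0) = 1   (exact solution: u(t) = exp(-t/tau))
# with the ansatz u(t) = 1 + t * phi(t) @ w, which builds in u(0) = 1.
# phi is a fixed random tanh feature layer, so zeroing the ODE residual
# at collocation points is a linear least-squares problem in w.
rng = np.random.default_rng(0)
tau = 1.0
t = np.linspace(0.0, 3.0, 100)[:, None]        # collocation points

n_feat = 40
w1 = rng.normal(0, 2.0, (1, n_feat))           # fixed random hidden layer
b1 = rng.normal(0, 2.0, n_feat)

phi = np.tanh(t @ w1 + b1)                     # phi(t)
dphi = (1.0 - phi**2) * w1                     # d phi / dt (analytic)

# residual r = du/dt + u/tau = (phi + t*dphi + (t/tau)*phi) @ w + 1/tau
A = phi + t * dphi + (t / tau) * phi
w = np.linalg.lstsq(A, -np.ones(len(t)) / tau, rcond=None)[0]

u = 1.0 + (t * phi) @ w[:, None]
u1 = float(u[np.argmin(np.abs(t - 1.0))])
print("u(1) =", u1, " exact:", np.exp(-1.0))
```

A full physics-informed simulator would instead backpropagate the residual of the circuit's nodal equations through a deep network, but the principle is the same: the loss is the governing physics, not labeled data.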

Advisor: Chenming Hu



BibTeX citation:

@phdthesis{Tung:EECS-2025-13,
    Author = {Tung, Chien-Ting},
    Title = {Modeling and Design Enablement for Future Computing},
    School = {EECS Department, University of California, Berkeley},
    Year = {2025},
    Month = {Apr},
    URL = {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2025/EECS-2025-13.html},
    Number = {UCB/EECS-2025-13},
    Abstract = {With the continued scaling of integrated circuits and the increasing number of transistors on a chip, the future IC industry faces tremendous power consumption and extreme design complexity. To address these two challenges, I studied new devices for low-power computing and new machine-learning-based computer-aided design techniques to accelerate IC simulation. My dissertation discusses the physics and compact modeling of new-generation memory devices for in-memory computing. Furthermore, a machine-learning compact modeling framework is proposed that can accelerate IC simulation by more than 5 times.

In Chapter 2, I demonstrate my work on polycrystalline ferroelectric capacitor modeling. This work studies the switching dynamics of multi-grain/domain ferroelectrics by calculating the accumulated switched area in a ferroelectric capacitor. The model successfully captures the dynamics of ferroelectric materials and can be used in circuit simulation for memory design. Chapter 3 extends this model to the ferroelectric tunnel junction (FTJ) with a newly developed tunneling current model. The FTJ is a memristor whose resistance depends on the stored polarization. The developed model accurately fits experimental FTJ data. Chapter 4 discusses a compact model of the ferroelectric field-effect transistor (FeFET). The FeFET uses a ferroelectric material as the gate stack, providing transistor and memory functions at the same time. I developed a model that includes minor-loop switching, steep switching, and the inverse memory window in FeFETs, with excellent fits to real device data. Chapter 5 discusses the physics and modeling of multi-grain/domain antiferroelectric materials. Chapter 6 studies the switching dynamics of ferroelectric materials at the nanoscale, where switching becomes discrete.

In addition to ferroelectric devices, other emerging memories are also strong candidates for future computing. In Chapter 7, I present a compact model of resistive random-access memory (RRAM) that unifies the different switching mechanisms of various types of RRAM. The model includes self-heating and disturbance effects in RRAMs and is tested with multi-level memory cell simulations. Magnetic random-access memory (MRAM) is another device I have studied. In Chapter 8, an MRAM compact model is demonstrated using a 1D Landau–Lifshitz–Gilbert (LLG) equation, which preserves the physics of the full LLG equation while providing a 2.5-times speedup.

From Chapter 9 to Chapter 11, I discuss my BSIM-NN framework, a neural-network-based compact model for advanced transistors. The model uses neural networks to replace the model equations of conventional transistor models. As a result, it provides a generic framework for transistor modeling, from FinFETs and gate-all-around (GAA) FETs to emerging transistors such as the negative-capacitance FET (NCFET), going beyond what traditional models can cover. Moreover, the compactness of this model can accelerate IC simulation speed by 5 times compared with the industry-standard compact model. The model includes all terminal currents and charges of advanced FETs, as well as the non-quasi-static effect and the self-heating effect.

Chapter 12 presents NeuroSpice, a new way to perform circuit simulation with physics-informed neural networks, offering an alternative solution for IC simulation.}
}

EndNote citation:

%0 Thesis
%A Tung, Chien-Ting
%T Modeling and Design Enablement for Future Computing
%I EECS Department, University of California, Berkeley
%D 2025
%8 April 24
%@ UCB/EECS-2025-13
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2025/EECS-2025-13.html
%F Tung:EECS-2025-13