Flash memory devices have represented a breakthrough in the storage industry since their inception in the mid-1980s, and innovation is still ongoing after more than 35 years.
This paper presents a new reliability threat that affects 3D-NAND Flash memories when a read operation is performed exiting from an idle state. In particular, a temporary large increase of the fail bit count is reported for the layers read first after a sequence of program/verify operations and an idle retention phase. The phenomenon, hereafter called Temporary Read Errors (TRE), is not due to a permanent change of the cell threshold voltage between program verify and the following read operations, but to its transient instability occurring during the idle phase and the first read operations performed on a block. The experimental analysis has been performed on off-the-shelf gigabit-array products to characterize the dependence on the memory operating conditions. The TRE is found to be strongly dependent on the page read, on the read temperature, and on the time delay between the first and the second read after the idle state. To emphasize its negative impact at the system level, we have evaluated the induced performance drop on Solid-State Drive architectures.
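The fail bit count (FBC) metric at the heart of this abstract is simply the number of bits that differ between the programmed pattern and the data read back. The following sketch illustrates how a transient FBC spike on the first read after idle would show up in such a measurement; the page size, error probabilities, and noise model are assumptions for illustration only, not real cell physics.

```python
import random

PAGE_BITS = 16 * 1024 * 8  # hypothetical 16 KiB page

def read_page(stored, flip_prob, rng):
    """Read back `stored`, flipping each bit independently with
    probability `flip_prob` (a toy noise model, not real cell physics)."""
    return [b ^ (rng.random() < flip_prob) for b in stored]

def fail_bit_count(stored, read):
    """FBC: number of bits that differ between the programmed pattern
    and the data read back."""
    return sum(s != r for s, r in zip(stored, read))

rng = random.Random(0)
stored = [rng.getrandbits(1) for _ in range(PAGE_BITS)]

# Assumed, purely illustrative error rates: the first read after idle is
# noisier than a follow-up read, reproducing the transient FBC spike.
fbc_first = fail_bit_count(stored, read_page(stored, 1e-3, rng))
fbc_second = fail_bit_count(stored, read_page(stored, 1e-4, rng))
print(fbc_first, fbc_second)
```

In a real characterization, the two reads would target the same physical page with a controlled time delay after the idle phase, mirroring the delay dependence reported above.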
Nowadays, NAND Flash technology is everywhere, since it is the core of code and data storage in mobile and embedded applications; moreover, its market share is exploding with Solid-State Drives (SSDs), which are replacing Hard Disk Drives (HDDs) in consumer and enterprise scenarios. To keep the evolutionary pace of the technology, NAND Flash must scale aggressively in terms of bit cost. When approaching ultra-scaled technologies, planar NAND hit a wall: both academia and industry worked to cope with this issue for several decades. Then, the 3D integration approach turned out to be the definitive alternative, eventually reaching mass production. This review paper surveys several 3D NAND Flash memory technologies, along with their related integration challenges, by showing their different layouts, scaling trends, and performance/reliability features.
Solid-state drives represent the preferred backbone storage solution thanks to their low latency and high throughput compared to mechanical hard disk drives. The performance of a drive is intertwined with the reliability of its memories; hence, modeling their reliability is an important task to support storage system designers. In the literature, storage developers devise dedicated parametric statistical approaches to model the evolution of the memory's error distribution through well-known statistical frameworks. Some of these well-founded reliability models have a deep connection with 3D NAND Flash technology. In fact, the more precise and accurate the model, the lower the probability of incurring storage performance slowdowns. In this work, to avoid some limitations of the parametric methods, a non-parametric approach to test model goodness-of-fit based on combined permutation tests is carried out. The results show that the electrical characterization of the different memory blocks and pages tested provides a Fail Bit Count (FBC) feature that can be well modeled using a multiple regression analysis.
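To give a feel for the permutation-test idea mentioned above, here is a minimal univariate sketch: fit a linear model of FBC versus program/erase cycles, then shuffle the responses many times to see how often chance alone reproduces the observed fit quality. This is a simplified illustration, not the paper's combined-permutation framework, and the synthetic FBC data (slope, noise level) are invented for the example.

```python
import random

def fit_line(xs, ys):
    """Ordinary least-squares intercept and slope for y ~ a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    return my - b * mx, b

def r_squared(xs, ys):
    """Coefficient of determination of the least-squares line."""
    a, b = fit_line(xs, ys)
    my = sum(ys) / len(ys)
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

def permutation_p_value(xs, ys, n_perm=2000, seed=0):
    """Permutation test: shuffling y breaks any x-y association, so we
    count how often a permuted R^2 reaches the observed one."""
    rng = random.Random(seed)
    observed = r_squared(xs, ys)
    ys_perm = list(ys)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(ys_perm)
        if r_squared(xs, ys_perm) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

# Synthetic example: FBC grows roughly linearly with program/erase cycles.
cycles = list(range(1, 31))
rng = random.Random(1)
fbc = [5 + 0.8 * c + rng.gauss(0, 2) for c in cycles]
print(permutation_p_value(cycles, fbc))
```

A small p-value indicates that the linear relation between cycling and FBC is very unlikely to arise by chance, which is the non-parametric analogue of a significant regression fit.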
Over the last 15 years, NAND Flash memories have changed our lives: Flash cards mainly in the secure digital (SD) form factor have almost completely replaced photographic films, and USB keys have driven floppy disks to extinction. Lately, thanks to a great tradeoff between cost and performance (i.e., write/read speed), NAND Flash technology has begun fighting against hard disk drives (HDDs) in the form of solid-state drives (SSDs).
Data randomization has been a widely adopted Flash signal processing technique for reducing or suppressing errors since the inception of mass storage platforms based on planar NAND Flash technology. However, the paradigm change represented by the 3D memory integration concept has complicated the randomization task due to the increased dimensions of the memory array, especially along the bitlines. In this work, we propose an easy-to-implement, cost-effective randomization scheme, fully scalable with the memory dimensions, that guarantees optimal randomization along both the wordline and the bitline dimensions. At the same time, we guarantee an upper bound on the maximum length of consecutive ones and zeros along the bitline to improve the memory reliability. Our method has been validated on commercial off-the-shelf TLC 3D NAND Flash memory with respect to the Raw Bit Error Rate metric extracted in different memory working conditions.
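The core mechanism behind Flash data randomization is XOR-ing each page with a pseudo-random keystream, typically from a linear-feedback shift register (LFSR). The sketch below shows the general idea under stated assumptions: seeding the keystream per wordline so that vertically aligned cells receive different masks, which is the intuition behind bounding runs of equal bits along a bitline. The LFSR width, taps, and seed mixing here are illustrative choices, not the scheme proposed in the paper.

```python
def lfsr_bits(seed, n):
    """16-bit Fibonacci LFSR (maximal taps 16, 14, 13, 11): a cheap
    pseudo-random keystream of the kind used for Flash scrambling."""
    state = seed & 0xFFFF or 0xACE1  # never start from the all-zero state
    out = []
    for _ in range(n):
        out.append(state & 1)
        bit = ((state >> 0) ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
        state = (state >> 1) | (bit << 15)
    return out

def scramble_page(data_bits, page_index):
    """XOR a page with a keystream seeded by its wordline index, so that
    vertically aligned cells on different wordlines get different masks
    (illustrative scheme, not the paper's actual randomizer)."""
    key = lfsr_bits(0xBEEF ^ ((page_index * 0x9E37) | 1), len(data_bits))
    return [d ^ k for d, k in zip(data_bits, key)]

# Scrambling is an involution: applying the same mask twice restores the data.
page = [1] * 64          # worst case: a constant all-ones pattern
scrambled = scramble_page(page, 0)
assert scramble_page(scrambled, 0) == page
assert scrambled != page  # the constant pattern is broken up
```

Because a maximal-length 16-bit LFSR can never emit more than 15 identical bits in a row, even a constant input pattern is guaranteed to be broken up by the keystream.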
Nowadays it is hard to find an electronic device which does not use codes: for example, we listen to music via heavily encoded audio CDs and we watch movies via encoded DVDs. There is at least one area where the use of encoding/decoding is not so developed yet: Flash non-volatile memories. Flash memory's high density, low power, cost effectiveness, and scalable design make it an ideal choice to fuel the explosion of multimedia products, like USB keys, MP3 players, digital cameras, and solid-state disks. In ECC for Non-Volatile Memories the authors expose the basics of coding theory needed to understand the application to memories, as well as the relevant design topics, with reference to both NOR and NAND Flash architectures. A collection of software routines is also included for better understanding. The authors form a research group (now at Qimonda) which is a typical example of a fruitful collaboration between mathematicians and engineers.
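As a taste of the coding theory the book covers, here is the classic Hamming(7,4) code, one of the simplest error-correcting codes: four data bits are protected by three parity bits, and any single flipped bit in the 7-bit codeword can be located and corrected from the syndrome. This is a textbook example, not a routine from the book itself.

```python
def hamming74_encode(d):
    """Encode 4 data bits [d0, d1, d2, d3] into a 7-bit Hamming codeword
    (parity bits at positions 1, 2, and 4; data at 3, 5, 6, 7)."""
    d0, d1, d2, d3 = d
    p1 = d0 ^ d1 ^ d3  # covers positions 1, 3, 5, 7
    p2 = d0 ^ d2 ^ d3  # covers positions 2, 3, 6, 7
    p4 = d1 ^ d2 ^ d3  # covers positions 4, 5, 6, 7
    return [p1, p2, d0, p4, d1, d2, d3]

def hamming74_decode(c):
    """Correct up to one flipped bit and return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s4  # 1-based position of the error, 0 if none
    c = list(c)
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
cw = hamming74_encode(data)
cw[5] ^= 1  # inject a single bit error
assert hamming74_decode(cw) == data
```

Production NAND controllers use far stronger codes (BCH, LDPC), but the encode/syndrome/correct structure shown here is the same pattern at larger scale.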
Precise assessment of calcification lesions in the Aortic Root (AR) is relevant for the success of the Transcatheter Aortic Valve Implantation (TAVI) procedure. To this end, radiologists analyze the Cardiac Computed Tomography (CCT) scans of patients and detect the position and extent of the calcium deposits. In this contribution, we develop a computationally efficient High-Performance Computing (HPC) system to detect, segment, and quantify volumes of calcium in contrast-enhanced CCTs, embedding two 3D Convolutional Neural Networks (CNNs) and an adaptive threshold filter in a three-step pipeline. The first step crops the images to a bounding box around the AR, keeping the original resolution; the second builds the segmentation; and the third detects and measures the volume of the calcium lesions. Our system is trained on high-resolution contrast CCTs routinely planned for the TAVI and manually annotated by expert radiologists, and is evaluated on a test set of patients with different levels of calcification. The accuracy achieved in segmenting the AR is approximately 92% on the test set, while the average difference of calcium lesion volumes with respect to the radiologists' measurements is about 0.49 mm3. Running on 4X NVIDIA-V100 and 8X NVIDIA-A100 GPU systems, we achieve a remarkable inference throughput of 17 and 70 CCT/s, respectively, and linear scaling of computing performance. Our contribution provides an HPC system suitable for installation on hospital premises and able to aid radiologists in assessing the calcification level of patients undergoing the TAVI, making this process automated, fast, and more reliable.
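The third pipeline step, quantifying lesion volume, reduces to counting flagged voxels and multiplying by the voxel volume given by the scan spacing. The sketch below shows that step with a fixed intensity threshold; the paper's pipeline uses an adaptive threshold and CNN segmentations, so the threshold value and the tiny toy scan here are assumptions for illustration.

```python
def threshold_mask(volume_hu, threshold_hu):
    """Binary mask of voxels whose intensity (Hounsfield units) reaches
    the threshold. A fixed threshold is used here for brevity; the
    paper's pipeline uses an adaptive one."""
    return [[[1 if v >= threshold_hu else 0 for v in row]
             for row in slice_] for slice_ in volume_hu]

def lesion_volume_mm3(mask, spacing_mm):
    """Lesion volume = flagged-voxel count times the voxel volume derived
    from the scan spacing (dx, dy, dz in millimetres)."""
    dx, dy, dz = spacing_mm
    voxels = sum(v for slice_ in mask for row in slice_ for v in row)
    return voxels * dx * dy * dz

# Toy 2x2x2 "scan": four voxels exceed a hypothetical 600 HU threshold.
scan = [[[700, 100], [650, 200]],
        [[800, 50], [620, 300]]]
mask = threshold_mask(scan, 600)
print(lesion_volume_mm3(mask, (0.5, 0.5, 0.5)))  # 4 voxels * 0.125 mm^3 = 0.5
```

Comparing such per-lesion volumes against the radiologists' manual measurements is how the reported 0.49 mm3 average difference would be computed.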
NAND Flash memories have changed and keep changing our lives. In the past two decades, NAND-based systems, in the form of Flash cards and USB keys, have replaced films and floppy disks. But disruption did not stop there. Today, NAND is really ubiquitous, as it plays the role of storage element inside smartphones and tablets; even further, it is now expanding its reach because solid-state drives (SSDs), i.e., drives built with several NAND devices, are replacing hard disk drives (HDDs) in more and more applications. To fuel this continuous evolution, NAND has to remain very aggressive in terms of cost per bit. When approaching 10-nm technologies, planar NAND is running out of steam: industry and academia have worked hard on finding a solution to this problem for more than a decade. Three-dimensional integration turned out to be the most promising alternative, and it is now eventually reaching the market. This paper is about 3-D NAND Flash memories and the related integration challenges. Charge trap and floating gate 3-D technologies will be discussed with the aid of several bird's-eye views. Advanced layout techniques will thoroughly be analyzed. Finally, future scaling trends will be presented.
Flash memory devices have represented a breakthrough in storage since their inception in the mid-1980s, and innovation is still ongoing. The peculiarity of such technology is an inherent flexibility in terms of performance and integration density according to the architecture devised for integration. The NOR Flash technology is still the workhorse of many code storage applications in the embedded world, ranging from microcontrollers for the automotive environment to IoT smart devices. Its usage is also forecast to be fundamental in emerging AI edge scenarios. In contrast, when massive data storage is required, NAND Flash memories are necessary to have in a system. NAND Flash can be found in USB sticks and cards, but most of all in Solid-State Drives (SSDs). Since SSDs are extremely demanding in terms of storage capacity, they fueled a new wave of innovation, namely the 3D architecture. Today "3D" means that multiple layers of memory cells are manufactured within the same piece of silicon, easily reaching a terabit capacity. So far, Flash architectures have always been based on "floating gate," where the information is stored by injecting electrons into a piece of polysilicon surrounded by oxide. In contrast, emerging concepts are based on "charge trap" cells. In summary, Flash memory devices represent the largest landscape of storage devices, and we expect more advancements in the coming years. This will require a lot of innovation in process technology, materials, circuit design, Flash management algorithms, Error Correction Codes and, finally, system co-design for new applications such as AI and security enforcement.