E-resources
Peer reviewed
Wang, Hao; Dong, Guangming; Chen, Jin; Hu, Xugang; Zhu, Zhibing
Mechanical Systems and Signal Processing, 01/2023, Volume 182 · Journal Article
Highlights:
- The dictionary has a deep architecture, where each deeper dictionary layer is learned from a few atoms of the previous dictionary layer.
- A shared sub-dictionary is added to dictionary learning to learn and remove the features common to different classes.
- The dictionary is extended by a shift-invariant strategy using a circulant matrix to overcome the time-shift problem of vibration signals.
- DSDL is more accurate than deep learning methods in time-varying fault diagnosis with small training samples.

As the core of Sparseland, dictionary learning has shown excellent performance in many fields, such as pattern recognition, fault diagnosis, noise reduction, and image recognition. Its key idea is that data can have a good sparse representation on a specific dictionary consisting of a few basis atoms, so this dictionary must be accurate and well suited to making the data sparse. Learning a good dictionary requires sufficient and comprehensive training data, and an efficient dictionary learning algorithm is also essential. However, in many application fields, especially fault diagnosis, training data is often scarce due to the cost of experimentation and time, among other reasons. It is therefore not guaranteed that the data can have a good sparse representation on a single learned dictionary. To solve this problem, we propose a novel dictionary learning method named deep and shared dictionary learning (DSDL), which combines a deep structure, inspired by deep learning, with a shared structure. In DSDL, the data is decomposed into several dictionary layers, where each deeper dictionary layer is learned from a few atoms of the previous layer. The shared structure, in turn, learns the features common to different classes and removes them to highlight the class-specific features.
We apply DSDL in two experimental cases of fault diagnosis under time-varying conditions, and the results show that our proposed method consistently outperforms six other state-of-the-art sparse representation methods. Compared to two popular deep learning methods, namely the convolutional neural network (CNN) and the deep belief network (DBN), DSDL is more accurate with small training samples.
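The abstract's premise — that a signal can be approximated by a sparse combination of a few dictionary atoms — can be illustrated with a minimal greedy sparse coder. The sketch below is a generic matching-pursuit routine for didactic purposes, not the paper's DSDL algorithm; the function name `matching_pursuit` and the toy dictionary are this example's own assumptions.

```python
import numpy as np

def matching_pursuit(D, x, k):
    """Greedily approximate signal x using at most k atoms of dictionary D.

    D : (n, m) array whose columns are unit-norm atoms.
    x : (n,) signal to be sparsely represented.
    k : number of greedy selection steps (sparsity budget).
    Returns the (m,) coefficient vector of the sparse code.
    """
    residual = x.astype(float).copy()
    coeffs = np.zeros(D.shape[1])
    for _ in range(k):
        # Pick the atom most correlated with the current residual.
        corr = D.T @ residual
        j = int(np.argmax(np.abs(corr)))
        # Accumulate its contribution and subtract it from the residual.
        coeffs[j] += corr[j]
        residual -= corr[j] * D[:, j]
    return coeffs

# Toy example: with an orthonormal (identity) dictionary, one greedy
# step recovers the single active atom exactly.
D = np.eye(3)
x = np.array([3.0, 0.0, 0.0])
code = matching_pursuit(D, x, k=1)
```

A learned dictionary plays the role of `D` here; the paper's contribution lies in how `D` is structured (deep layers, a shared sub-dictionary, and circulant shift-invariant extensions), while the sparse-coding step itself follows this general pattern.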