Akademska digitalna zbirka Slovenije

Search results

hits: 1,892
1.
  • Generalization and Expressivity for Deep Nets
    Lin, Shao-Bo. IEEE Transactions on Neural Networks and Learning Systems, 05/2019, Volume: 30, Issue: 5
    Journal Article
    Open access

    Along with the rapid development of deep learning in practice, theoretical explanations for its success have become urgent. Generalization and expressivity are two widely used measurements to quantify ...
Full text
Available for: IJS, NUK, UL

2.
  • Construction of Deep ReLU Nets for Spatially Sparse Learning
    Liu, Xia; Wang, Di; Lin, Shao-Bo. IEEE Transactions on Neural Networks and Learning Systems, 10/2023, Volume: 34, Issue: 10
    Journal Article

    Training an interpretable deep net to embody its theoretical advantages is difficult but extremely important in the community of machine learning. In this article, noticing the importance of spatial ...
Full text
Available for: IJS, NUK, UL
3.
  • Limitations of shallow nets approximation
    Lin, Shao-Bo. Neural Networks, October 2017, Volume: 94
    Journal Article
    Peer reviewed

    In this paper, we aim at analyzing the approximation abilities of shallow networks in reproducing kernel Hilbert spaces (RKHSs). We prove that there is a probability measure such that the achievable ...
Full text
Available for: GEOZS, IJS, IMTLJ, KILJ, KISLJ, NUK, OILJ, PNG, SAZU, SBCE, SBJE, UL, UM, UPCLJ, UPUK, ZRSKP
4.
  • Random Sketching for Neural Networks With ReLU
    Wang, Di; Zeng, Jinshan; Lin, Shao-Bo. IEEE Transactions on Neural Networks and Learning Systems, February 2021, Volume: 32, Issue: 2
    Journal Article

    Training neural networks has recently become a hot topic in machine learning due to its great success in many applications. Since neural network training usually involves a highly nonconvex ...
Full text
Available for: IJS, NUK, UL
5.
  • Distributed Learning With Dependent Samples
    Sun, Zirui; Lin, Shao-Bo. IEEE Transactions on Information Theory, September 2022, Volume: 68, Issue: 9
    Journal Article
    Peer reviewed
    Open access

    This paper focuses on learning rate analysis of distributed kernel ridge regression (DKRR) for strong mixing sequences. Using a recently developed integral operator approach and a classical ...
Full text
Available for: IJS, NUK, UL
6.
  • Distributed Kernel-Based Gradient Descent Algorithms
    Lin, Shao-Bo; Zhou, Ding-Xuan. Constructive Approximation, 04/2018, Volume: 47, Issue: 2
    Journal Article
    Peer reviewed

    We study the generalization ability of distributed learning equipped with a divide-and-conquer approach and gradient descent algorithm in a reproducing kernel Hilbert space (RKHS). Using special ...
Full text
Available for: EMUNI, FIS, FZAB, GEOZS, GIS, IJS, IMTLJ, IZUM, KILJ, KISLJ, MFDPS, NLZOH, NUK, OBVAL, OILJ, PILJ, PNG, SAZU, SBCE, SBJE, SBMB, SBNM, UKNU, UL, UM, UPUK, VKSCE, ZAGLJ
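
    Entries 5 and 6 above both rest on the same divide-and-conquer principle for distributed kernel learning: split the samples across local machines, fit a kernel estimator on each block, and average the local predictors. The sketch below is only a rough illustration of that principle, not code from either paper; it uses kernel ridge regression as the local solver (entry 6 analyses a gradient descent variant instead), and the Gaussian kernel, regularisation parameter, and synthetic data are arbitrary choices made for the example.

```python
import numpy as np

def gaussian_kernel(A, B, width=1.0):
    """Gaussian (RBF) kernel matrix between row-sample matrices A and B."""
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / (2 * width**2))

def local_krr(X_j, y_j, lam=1e-3, width=1.0):
    """Kernel ridge regression on one data block; returns a local predictor."""
    K = gaussian_kernel(X_j, X_j, width)
    alpha = np.linalg.solve(K + lam * len(y_j) * np.eye(len(y_j)), y_j)
    return lambda X_new: gaussian_kernel(X_new, X_j, width) @ alpha

def distributed_krr(X, y, n_blocks=4, lam=1e-3, width=1.0):
    """Divide-and-conquer KRR: average the predictors fitted on disjoint blocks."""
    blocks = np.array_split(np.random.permutation(len(y)), n_blocks)
    local_predictors = [local_krr(X[b], y[b], lam, width) for b in blocks]
    return lambda X_new: np.mean([f(X_new) for f in local_predictors], axis=0)

# Toy usage on synthetic 1-D regression data (purely illustrative).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(800, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(800)
predict = distributed_krr(X, y, n_blocks=4)
X_test = np.linspace(-3, 3, 200)[:, None]
print("test MSE:", np.mean((predict(X_test) - np.sin(X_test[:, 0])) ** 2))
```

    Averaging closed-form local solutions keeps the per-machine cost cubic only in the block size while, under the conditions studied in these papers, the averaged estimator can retain the learning rate of a single machine trained on all the data.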
7.
  • Synaptic Suppression Triplet‐STDP Learning Rule Realized in Second‐Order Memristors
    Yang, Rui; Huang, He‐Ming; Hong, Qing‐Hui ... Advanced Functional Materials, January 31, 2018, Volume: 28, Issue: 5
    Journal Article
    Peer reviewed

    The synaptic weight modification depends not only on the interval of the pre‐/postspike pairs according to spike‐timing‐dependent plasticity (classical pair‐STDP), but also on the timing of the preceding ...
Full text
Available for: BFBNIB, FZAB, GIS, IJS, KILJ, NLZOH, NUK, OILJ, SBCE, SBMB, UL, UM, UPUK
8.
  • Depth Selection for Deep ReLU Nets in Feature Extraction and Generalization
    Han, Zhi; Yu, Siquan; Lin, Shao-Bo ... IEEE Transactions on Pattern Analysis and Machine Intelligence, April 2022, Volume: 44, Issue: 4
    Journal Article
    Peer reviewed
    Open access

    Deep learning is recognized to be capable of discovering deep features for representation learning and pattern recognition without requiring elegant feature engineering techniques by taking ...
Full text
Available for: IJS, NUK, UL

9.
  • Fully corrective gradient boosting with squared hinge: Fast learning rates and early stopping
    Zeng, Jinshan; Zhang, Min; Lin, Shao-Bo. Neural Networks, March 2022, Volume: 147
    Journal Article
    Peer reviewed
    Open access

    In this paper, we propose an efficient boosting method with theoretical guarantees for binary classification. There are three key ingredients of the proposed boosting method: a fully corrective ...
Full text
Available for: GEOZS, IJS, IMTLJ, KILJ, KISLJ, NUK, OILJ, PNG, SAZU, SBCE, SBJE, UL, UM, UPCLJ, UPUK, ZRSKP
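
    The recipe named in the title of the entry above combines three ingredients: a squared hinge loss for binary classification, a fully corrective update that re-optimises the weights of all previously selected weak learners at every round, and early stopping. The snippet below is a schematic reconstruction of that recipe under simple assumptions (decision stumps via scikit-learn as weak learners, plain gradient steps for the corrective stage, a crude patience-based stopping rule), not the authors' implementation; function names and all hyperparameter values are invented for the illustration.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def squared_hinge_grad(y, f):
    """Gradient of the squared hinge loss max(0, 1 - y*f)**2 with respect to the score f."""
    margin = 1.0 - y * f
    return np.where(margin > 0, -2.0 * y * margin, 0.0)

def fully_corrective_boost(X, y, rounds=50, corrective_steps=20, lr=0.1, patience=5):
    """Schematic fully corrective boosting with squared hinge loss; y must be in {-1, +1}."""
    stumps, weights = [], np.zeros(0)
    scores = np.zeros(len(y))
    best_loss, stall = np.inf, 0
    for _ in range(rounds):
        # 1) Functional gradient step: fit a stump to the negative gradient.
        residual = -squared_hinge_grad(y, scores)
        stumps.append(DecisionTreeRegressor(max_depth=1).fit(X, residual))
        weights = np.append(weights, 0.0)
        H = np.column_stack([s.predict(X) for s in stumps])
        # 2) Fully corrective step: re-optimise *all* weights by a few gradient steps.
        for _ in range(corrective_steps):
            scores = H @ weights
            weights = weights - lr * (H.T @ squared_hinge_grad(y, scores)) / len(y)
        scores = H @ weights
        # 3) Early stopping once the training loss stops improving.
        loss = np.mean(np.maximum(0.0, 1.0 - y * scores) ** 2)
        if loss < best_loss - 1e-6:
            best_loss, stall = loss, 0
        else:
            stall += 1
            if stall >= patience:
                break
    return stumps, weights

# Toy usage on synthetic data (purely illustrative).
rng = np.random.default_rng(0)
X = rng.standard_normal((400, 2))
y = np.sign(X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.standard_normal(400))
stumps, weights = fully_corrective_boost(X, y)
train_scores = sum(w * s.predict(X) for s, w in zip(stumps, weights))
print("training accuracy:", np.mean(np.sign(train_scores) == y))
```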

10.
  • Learning theory of distributed spectral algorithms
    Guo, Zheng-Chu; Lin, Shao-Bo; Zhou, Ding-Xuan. Inverse Problems, 07/2017, Volume: 33, Issue: 7
    Journal Article
    Peer reviewed

    Spectral algorithms have been widely used and studied in learning theory and inverse problems. This paper is concerned with distributed spectral algorithms, for handling big data, based on a ...
Full text
Available for: NUK, UL