Search results



hits: 67
1.
  • On Capacity of Downlink Underwater Wireless Optical MIMO Systems With Random Sea Surface
    Zhang, Huihui; Dong, Yuhan; Hui, Like. IEEE Communications Letters, December 2015, Volume: 19, Issue: 12
    Journal Article
    Peer reviewed

    In this letter, we focus on the relationship among channel capacity, signal-to-noise ratio (SNR), water types, wind speed, and characteristics of the transmitter/receiver array such as inter-spacing and ...
Full text
2.
  • Angle of Arrival Analysis for Underwater Wireless Optical Links
    Zhang, Huihui; Hui, Like; Dong, Yuhan. IEEE Communications Letters, December 2015, Volume: 19, Issue: 12
    Journal Article
    Peer reviewed

    In underwater wireless optical communications (UWOC), each emitted photon is scattered at a random deviation angle when propagating through the underwater channel. Then the light beam observed at ...
Full text
3.
  • ReLU soothes the NTK condition number and accelerates optimization for wide neural networks
    Liu, Chaoyue; Hui, Like. arXiv (Cornell University), 05/2023
    Paper, Journal Article
    Open access

    Rectified linear unit (ReLU), as a non-linear activation function, is well known to improve the expressivity of neural networks such that any continuous function can be approximated to arbitrary ...
Full text
4.
  • Evaluation of Neural Architectures Trained with Square Loss vs Cross-Entropy in Classification Tasks
    Hui, Like; Belkin, Mikhail. arXiv (Cornell University), 10/2021
    Paper, Journal Article
    Open access

    Modern neural architectures for classification tasks are trained using the cross-entropy loss, which is widely believed to be empirically superior to the square loss. In this work we provide evidence ...
Full text
5.
  • Surprising Empirical Phenomena of Deep Learning and Kernel Machines
    Hui, Like. 01/2023
    Dissertation

    Over the past decade, the field of machine learning has witnessed significant advancements in artificial intelligence, primarily driven by empirical research. Within this context, we present various ...
Full text
6.
  • Cut your Losses with Squentropy
    Hui, Like; Belkin, Mikhail; Wright, Stephen. arXiv.org, 02/2023
    Paper, Journal Article
    Open access

    Nearly all practical neural models for classification are trained using cross-entropy loss. Yet this ubiquitous choice is supported by little theoretical or empirical evidence. Recent work (Hui & ...
Full text
7.
  • Limitations of Neural Collapse for Understanding Generalization in Deep Learning
    Hui, Like; Belkin, Mikhail; Nakkiran, Preetum. arXiv (Cornell University), 02/2022
    Paper, Journal Article
    Open access

    The recent work of Papyan, Han, & Donoho (2020) presented an intriguing "Neural Collapse" phenomenon, showing a structural property of interpolating classifiers in the late stage of training. This ...
Full text
8.
  • Joint Training of Complex Ratio Mask Based Beamformer and Acoustic Model for Noise Robust ASR
    Xu, Yong; Weng, Chao; Hui, Like ... ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
    Conference Proceeding

    In this paper, we present a joint training framework between the multi-channel beamformer and the acoustic model for noise robust automatic speech recognition (ASR). The complex ratio mask (CRM), ...
Full text
9.
  • Kernel Machines Beat Deep Neural Networks on Mask-based Single-channel Speech Enhancement
    Hui, Like; Ma, Siyuan; Belkin, Mikhail. arXiv (Cornell University), 11/2018
    Paper, Journal Article
    Open access

    We apply a fast kernel method for mask-based single-channel speech enhancement. Specifically, our method solves a kernel regression problem associated with a non-smooth kernel function (exponential ...
Full text
10.
  • Convolutional maxout neural networks for speech separation
    Hui, Like; Cai, Meng; Guo, Cong ... 2015 IEEE International Symposium on Signal Processing and Information Technology (ISSPIT), 12/2015
    Conference Proceeding

    Speech separation based on deep neural networks (DNNs) has been widely studied recently, and has achieved considerable success. However, previous studies are mostly based on fully-connected neural ...
Full text
