It remains unclear whether repetitions leading to failure (failure training) or not leading to failure (non-failure training) lead to superior muscular strength gains during resistance exercise. Failure training may provide the stimulus needed to enhance muscular strength development. However, it is argued that non-failure training leads to similar increases in muscular strength without the need for high levels of discomfort and physical effort, which are associated with failure training.
We conducted a systematic review and meta-analysis to examine the effect of failure versus non-failure training on muscular strength.
Five electronic databases were searched using terms related to failure and non-failure training. Studies were deemed eligible for inclusion if they met the following criteria: (1) randomised and non-randomised studies; (2) resistance training intervention where repetitions were performed to failure; (3) a non-failure comparison group; (4) resistance training interventions with a total of ≥3 exercise sessions; and (5) muscular strength assessment pre- and post-training. Random-effects meta-analyses were performed to pool the results of the included studies and generate a weighted mean effect size (ES).
Eight studies were included in the meta-analysis (combined studies). Training volume was controlled in four studies (volume controlled), while the remaining four studies did not control for training volume (volume uncontrolled). Non-failure training resulted in a 0.6-1.3% greater strength increase than failure training. A small pooled effect favouring non-failure training was found (ES = 0.34; p = 0.02). Significant small pooled effects on muscular strength were also found for non-failure versus failure training with compound exercises (ES = 0.37-0.38; p = 0.03) and trained participants (ES = 0.37; p = 0.049). A slightly larger pooled effect favouring non-failure training was observed when volume-uncontrolled studies were included (ES = 0.41; p = 0.047). No significant effect was found for the volume-controlled studies, although there was a trend favouring non-failure training. The methodological quality of the included studies in the review was found to be moderate. Exercise compliance was high for the studies where this was reported (n = 5), although limited information on adverse events was provided.
Overall, the results suggest that despite statistically significant effects on muscular strength being found for non-failure compared with failure training, the small percentage of improvement shown for non-failure training is unlikely to be meaningful. Therefore, it appears that similar increases in muscular strength can be achieved with failure and non-failure training. Furthermore, it seems unnecessary to perform failure training to maximise muscular strength; however, if incorporated into a programme, training to failure should be performed sparingly to limit the risks of injuries and overtraining.
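The random-effects pooling behind a weighted mean effect size (ES) like those reported above can be sketched as follows. This is a generic DerSimonian-Laird implementation; the per-study effect sizes and variances are invented for illustration and are not the eight included studies.

```python
# Sketch of DerSimonian-Laird random-effects pooling, the general technique
# named in the abstract. Inputs below are invented, NOT the study's data.
import math

def pool_random_effects(effects, variances):
    """Pool per-study effect sizes into a weighted mean effect size (ES)."""
    w = [1.0 / v for v in variances]                        # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw   # fixed-effect mean
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)                           # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]            # random-effects weights
    es = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))                         # standard error of ES
    return es, se

# Hypothetical per-study standardised mean differences and their variances
es, se = pool_random_effects([0.2, 0.5, 0.4, 0.3], [0.04, 0.05, 0.03, 0.06])
```

With these invented inputs the heterogeneity is low (Q < df), so the estimated between-study variance is zero and the pooled ES reduces to the fixed-effect mean.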
Rainfall-triggered shallow slope failures are very common in the western Southern Alps of New Zealand, causing widespread damage to property and infrastructure, injury and loss of life. This study develops a geographic information system (GIS)-based approach for shallow landslide/debris-flow susceptibility assessment. Since landslides are complex and their prediction involves many uncertainties, fuzzy logic is used to deal with uncertainties inherent in spatial analysis and limited knowledge on the relationship between conditioning factors and slope instability. A landslide inventory was compiled using data from existing catalogues, satellite imagery and field observations. Ten parameters were initially identified as the most important conditioning factors for rainfall-generated slope failures in the study area, and fuzzy memberships were established between each parameter and landslide occurrence based on both the landslide inventory and user-defined functions. Three output landslide susceptibility maps were developed and evaluated in a test area using an independent population of landslides. The models demonstrated satisfactory performance with area under the curve (AUC) varying from 0.708 to 0.727. Sensitivity analyses showed that a six-parameter model using slope angle, lithology, slope aspect, proximity to faults, soil induration, and proximity to drainage network had the highest predictive performance (AUC = 0.734). The runout path and distance of potential future landslides from the susceptible areas were also modelled based on a multiple flow direction algorithm and the topographic slope of existing debris-flow deposits. The final susceptibility map has the potential to inform regional-scale land-use planning and to prioritize areas where hazard mitigation measures are required.
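The AUC validation step described above, scoring a susceptibility map against an independent landslide population, can be sketched as follows. The scores are synthetic; a real analysis would compare per-cell susceptibility values against inventory landslide locations.

```python
# Sketch of AUC validation for a susceptibility map: the probability that a
# landslide cell outscores a stable cell (Mann-Whitney U formulation; ties
# count half). Scores below are synthetic illustrations.
def roc_auc(scores_pos, scores_neg):
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

landslide_cells = [0.9, 0.8, 0.75, 0.6]    # susceptibility at observed landslides
stable_cells = [0.7, 0.4, 0.3, 0.2, 0.1]   # susceptibility at stable cells
auc = roc_auc(landslide_cells, stable_cells)
```

An AUC of 0.5 indicates no discrimination and 1.0 perfect discrimination, which is why values around 0.7 are usually described as satisfactory.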
Computer Vision (CV) has become increasingly important for Single-Board Computers (SBCs) due to their widespread deployment in addressing real-world problems. Specifically, in the context of smart cities, there is an emerging trend of developing end-to-end video analytics solutions designed to address urban challenges such as traffic management, disaster response, and waste management. However, deploying CV solutions on SBCs presents several pressing challenges (e.g., limited computation power, inefficient energy management, and real-time processing needs) that hinder their use at scale. Graphics Processing Units (GPUs) and software-level developments have recently emerged to address these challenges and improve SBC performance; however, this remains an active area of research, and the literature lacks a comprehensive review of these recent and rapidly evolving advancements on both the software and hardware fronts. This review provides a detailed overview of existing GPU-accelerated edge-computing SBCs and of software advancements including algorithm optimization techniques, packages, development frameworks, and hardware-deployment-specific packages. It offers a subjective comparative analysis based on critical factors to help applied Artificial Intelligence (AI) researchers understand the existing state of the art and select the best-suited combinations for their specific use case. Finally, the paper discusses the limitations of existing SBCs and highlights future research directions in this domain.
The increase in global waste generation rates over the last few decades has made waste management a significant problem. One approach adopted globally is to recycle a significant portion of generated waste. However, contamination of recyclable waste has been a major problem in this context, rendering almost 75% of recyclable waste unusable. For sustainable development, efficient management and recycling of waste are of huge importance. To reduce waste contamination rates, a manual bin-tagging approach is conventionally adopted; however, this is inefficient and requires substantial labor. Within household waste contamination, plastic bags have been found to be one of the main contaminants. Towards automating plastic-bag contamination detection, this paper proposes an edge-computing video analytics solution using the latest Artificial Intelligence (AI), Artificial Intelligence of Things (AIoT) and computer vision technologies. The proposed system captures video of waste from the truck hopper, processes it using edge-computing hardware to detect plastic-bag contamination, and stores the contamination-related information for further analysis. Faster R-CNN and You Only Look Once version 4 (YOLOv4) deep learning model variants were trained using the Remondis Contamination Dataset (RCD), developed from Remondis manual-tagging historical records. The overall system was evaluated in terms of software and hardware performance using standard evaluation measures (i.e., training performance, testing performance, Frames Per Second (FPS), system usage, power consumption). From this analysis, YOLOv4 with a CSPDarkNet_tiny backbone was identified as a suitable candidate, with a Mean Average Precision (mAP) of 63% and 24.8 FPS on NVIDIA Jetson TX2 hardware.
Data collected from the deployment of the edge-computing hardware on waste collection trucks were then used to retrain the models, improving the performance of the retrained YOLOv4 model with CSPDarkNet_tiny backbone in terms of mAP, False Positives (FPs), False Negatives (FNs) and True Positives (TPs). A detailed cost analysis of the proposed system is also provided for stakeholders and policy makers.
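As a minimal illustration of the detection metrics above, TP/FP/FN counts map to precision and recall as follows; the counts used here are invented, not the reported results.

```python
# Sketch: deriving precision and recall from detection counts for a
# contamination detector. Counts are invented for illustration.
def precision_recall(tp, fp, fn):
    precision = tp / (tp + fp)   # fraction of flagged detections that are real contamination
    recall = tp / (tp + fn)      # fraction of contamination events actually flagged
    return precision, recall

p, r = precision_recall(tp=80, fp=20, fn=40)
```

Reducing FPs raises precision (fewer false alarms for operators to review), while reducing FNs raises recall (less contamination slips through), which is why retraining was assessed on both.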
Currently, regional coseismic landslide hazard analyses require comprehensive historical landslide inventories as well as detailed geotechnical data. Consequently, such analyses have not been possible where these data are not available. A new approach is proposed herein to assess coseismic landslide hazard at regional scale for specific earthquake scenarios in areas without historical landslide inventories. The proposed model employs fuzzy logic and geographic information systems to establish relationships between causative factors and coseismic slope failures in regions with well‐documented and substantially complete coseismic landslide inventories. These relationships are then utilized to estimate the relative probability of landslide occurrence in regions with neither historical landslide inventories nor detailed geotechnical data. Statistical analyses of inventories from the 1994 Northridge and 2008 Wenchuan earthquakes reveal that shaking intensity, topography, and distance from active faults and streams are the main controls on the spatial distribution of coseismic landslides. Average fuzzy memberships for each factor are developed and aggregated to model the relative coseismic landslide hazard for both earthquakes. The predictive capabilities of the models are assessed and show good‐to‐excellent model performance for both events. These memberships are then applied to the 1999 Chi‐Chi earthquake, using only a digital elevation model, active fault map, and isoseismal data, replicating prediction of a future event in a region lacking historic inventories and/or geotechnical data. This similarly results in excellent model performance, demonstrating the model's predictive potential and confirming it can be meaningfully applied in regions where previous methods could not.
For such regions, this method may enable analysis of coseismic landslide hazard from specific earthquake scenarios, allowing mitigation measures and emergency response plans to be better informed of earthquake‐related hazards.
Key Points
MMI, tectonics, and topography control coseismic landslide occurrence
The modeling approach incorporates fuzzy set theory in GIS
The model requires no site‐specific data inputs to perform the assessment
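The fuzzy aggregation step in the abstract above can be sketched with a gamma operator, a standard choice in fuzzy GIS overlays. The abstract does not specify which operator the authors used, so this is one plausible form, and the memberships are invented.

```python
# Sketch of aggregating per-factor fuzzy memberships into a relative hazard
# score. The gamma operator blends the fuzzy algebraic product (pessimistic)
# and algebraic sum (optimistic). Memberships below are invented.
import math

def fuzzy_gamma(memberships, gamma=0.9):
    product = math.prod(memberships)
    algebraic_sum = 1.0 - math.prod(1.0 - m for m in memberships)
    return (algebraic_sum ** gamma) * (product ** (1.0 - gamma))

# Hypothetical memberships for shaking intensity, slope, fault and stream proximity
hazard = fuzzy_gamma([0.8, 0.6, 0.7, 0.5])
```

A gamma near 1 weights the optimistic algebraic sum, a gamma near 0 the conservative product; the result always stays within [0, 1], so maps from different events remain comparable.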
Although large rock avalanches are infrequent, sediment production in active orogens is dominated by such events, which may strongly influence geomorphic processes. Where rock avalanches fall onto glaciers, they may affect glacier behaviour and moraine formation. We outline the processes of rock avalanche initiation and motion, and show that supraglacial deposits of rock avalanche debris have distinct sedimentological and thermal properties. Laboratory experiments on the effects of such debris on ice ablation are supplemented by field data from two rock avalanches in the Southern Alps, New Zealand. Their effects are compared with those of the thinner supraglacial debris that results from small rockfalls and melt-out of englacial debris. Implications of rock–avalanche debris cover for glacier behaviour are explored using a mass-balance model of the Franz Josef Glacier in New Zealand, demonstrating a likely supraglacial rock avalanche origin for the Waiho Loop moraine, and considering the potential hazard of a large rock avalanche onto the present-day glacier.
► We summarised the knowledge of rock avalanche initiation and runout onto glaciers. ► The effect of rock avalanche debris on ice-surface ablation is compared with that of melt-out debris. ► These effects are studied on two modern rock avalanches on glaciers in the Southern Alps. ► Resultant effects on glacier mass balance, dynamics and deposition are determined. ► The Franz Josef Glacier is used as an example of hazard and paleoclimatic implications.
It’s been ten years since open data first broke onto the global stage. Over the past decade, thousands of programmes and projects around the world have worked to open data and use it to address a myriad of social and economic challenges. Meanwhile, issues related to data rights and privacy have moved to the centre of public and political discourse. As the open data movement enters a new phase in its evolution, shifting to target real-world problems and embed open data thinking into other existing or emerging communities of practice, big questions still remain. How will open data initiatives respond to new concerns about privacy, inclusion, and artificial intelligence? And what can we learn from the last decade in order to deliver impact where it is most needed? The State of Open Data brings together over 60 authors from around the world to address these questions and to take stock of the real progress made to date across sectors and around the world, uncovering the issues that will shape the future of open data in the years to come.
The conventional processes of science, and the incorporation of science into policy and practice, appear not to be resulting in improved disaster reduction solutions for communities, despite intense research into hazards and risk. Resilience to disasters is increased when the societal impacts of disasters are reduced. On this basis, the contribution that Disaster Risk Reduction (DRR) can make to Disaster Impact Reduction (DIR) is assessed, and it is demonstrated that reducing event risk by reducing event probability only reliably reduces community disaster impacts for events that occur frequently. Such events do not fit the UNISDR definition of a disaster. Therefore, DRR cannot reliably improve DIR. Instead, DIR can be addressed directly by way of community adaptation, based on carefully selected impact scenarios derived by community-expert-official collaborations considering a broad range of event and asset damage scenarios. Probabilistic risk is a useful tool in insurance and re-insurance, and possibly in national policy-making, but such national policies are likely to be undermined by inevitable failures of risk-based approaches at the local level. This work clarifies the common usage of “risk” as meaning either impact, or impact × probability.
• The relationship between risk and resilience is clearly defined.
• Resilience can be achieved by reduction of future disaster impacts.
• Reducing disaster probability does not reliably increase resilience to that event.
• Resilience can be increased by scenario-based adaptation to reduce disaster impacts.
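The distinction drawn above between “risk” as impact × probability and “risk” as impact can be shown with a toy calculation; all numbers are illustrative.

```python
# Toy numbers illustrating the two senses of "risk": expected annual loss
# (impact x probability) versus the impact itself. Values are illustrative.
annual_probability = 0.002           # rare event, ~500-year return period
impact = 1_000_000_000               # community impact if the event occurs

risk_before = impact * annual_probability        # "risk" as expected annual loss
risk_after = impact * annual_probability / 2     # probability-based DRR halves it...
impact_if_it_occurs = impact                     # ...but the impact, if it occurs, is unchanged
```

Halving the probability of a rare event halves probabilistic “risk” on paper, yet the community that actually experiences the event suffers the same impact, which is the abstract’s case for scenario-based adaptation.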
Massive rock avalanches form some of the largest landslide deposits on Earth and are major geohazards in high-relief mountains. This work reinterprets a previously reported glacial deposit in the Alai Valley of Kyrgyzstan as the result of an extremely long-runout, probably coseismic, rock avalanche from the Komansu River catchment. Total runout of the rock avalanche is ~28 km, making it one of the longest-runout subaerial non-volcanic rock avalanches thus far identified on Earth. This runout length appears to require a rock volume of ~20 km³; however, the likely source zone in the Trans Alai range contained just ~4 km³ of rock, and presently, the deposit has a volume of only 3–5 km³; a pure rock avalanche volume of >10 km³ is therefore impossible, so the event was much more mobile than most non-volcanic rock avalanches. Explaining this exceptional mobility is crucial for present-day hazard analysis. There is unequivocal sedimentary evidence for intense basal fragmentation, and the deposit in the Alai Valley has prominent hummocks; these indicate a rock avalanche rather than a rock-ice avalanche origin. The event occurred 5,000–11,000 yr B.P., after the region’s glaciers had begun retreating, implying that supraglacial runout was limited. Current volume–runout relationships suggest a maximum runout of ~10 km for a 4-km³ rock avalanche. Volcanic debris avalanches, however, are more mobile than non-volcanic rock avalanches due to their much higher source water content; a rock avalanche containing a similarly high water content would require a volume of about 8 km³ to explain the extreme runout of the Komansu event. Rock and debris avalanches can entrain large amounts of material during runout, with some doubling their initial volume. The best current explanation of the Komansu rock avalanche thus involves an initial failure of ~4 km³ of rock debris, with high water content probably deriving from large glaciers on the edifice, that subsequently entrained ~4 km³ of valley material together with further glacial ice, resulting in a total runout of 28 km. It is as yet unclear whether glacial retreat has rendered a present-day repetition of such an event impossible.