This article presents an hpr‐adaptive crack propagation method for highly accurate 2D crack propagation paths which requires no a priori knowledge of the tip solution. The propagation method is designed to be simple to implement, with only hr‐adaptivity required and a propagation step size independent of the initial mesh, allowing users to obtain high-fidelity crack path predictions for domains containing multiple cracks propagating at different rates. The proposed method also includes a crack path derefinement scheme, in which elements away from the crack tip are derefined while elements close to the crack tip are kept small to capture the fidelity of the crack path. As a result, propagating cracks over increasingly large distances incurs negligible additional computational effort and has negligible effect on the predicted propagation path. The linear elastic problem is solved using the hp discontinuous Galerkin symmetric interior penalty finite element method, which is post‐processed to obtain the configurational force at each tip to a user-defined accuracy. Several numerical examples demonstrate the accuracy, efficiency, and capability of the method. Owing to the method's high accuracy, the crack path solutions of benchmark problems used prolifically in the literature are challenged.
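For orientation, in linear elasticity the configurational force at a crack tip is commonly obtained from the Eshelby energy–momentum tensor; the form below is the standard one, stated here for context rather than taken from the article, and sign conventions differ between authors:

```latex
\Sigma = W\,\mathbf{I} - (\nabla\mathbf{u})^{\mathsf{T}}\boldsymbol{\sigma},
\qquad
\mathbf{F}_{\mathrm{tip}} = -\lim_{\varepsilon \to 0}\int_{\Gamma_\varepsilon} \Sigma\,\mathbf{n}\,\mathrm{d}\Gamma ,
```

where $W$ is the strain energy density, $\boldsymbol{\sigma}$ the Cauchy stress, and $\Gamma_\varepsilon$ a contour shrinking onto the tip. The crack is advanced along the direction of the configurational force, whose magnitude plays the role of an energy release rate; for a straight crack the component along the crack line coincides with the J-integral.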
This paper proposes the notion of model adaptivity for fluid flow modelling, where the underlying model (the governing equations) is adaptively changed in space and time. Specifically, this work introduces a hybrid and adaptive coupling of a 3D bulk fluid flow model with a 2D thin film flow model. As a result, it extends the applicability of existing thin film flow models to complex scenarios where, for example, bulk flow develops into thin films after striking a surface. At each location in space and time, the proposed framework automatically decides whether a 3D model or a 2D model must be applied. Using a meshless approach for both the 3D and 2D models, the decision at each particle to apply a 2D or 3D model is based on the user-prescribed resolution and a local principal component analysis. When a particle needs to be changed from a 3D model to 2D, or vice versa, the discretization is changed and all relevant data mapping is done on the fly. Appropriate two-way coupling conditions and mass conservation considerations between the 3D and 2D models are also developed. Numerical results demonstrate that this model-adaptive framework offers high flexibility and compares well against finely resolved 3D simulations. In an actual application scenario, a factor-of-3 speed-up is obtained while maintaining the accuracy of the solution.
•Novel notion of model adaptivity: governing equations are adaptively changed.
•Coupling 3D Navier–Stokes and 2D thin film flow.
•No a-priori information describing where to use which model.
•Model and discretization changed on-the-fly.
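The local principal component analysis used for the 2D/3D decision can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: at a particle, the covariance of neighbouring particle positions is diagonalised, and if the spread along the smallest principal axis falls below a user-prescribed resolution `h`, the flow is locally sheet-like and the 2D thin film model is selected; otherwise the 3D bulk model is kept. The names `classify_particle` and `h` are illustrative.

```python
import numpy as np

def classify_particle(neighbour_positions: np.ndarray, h: float) -> str:
    """Return '2D' if the local particle cloud is thinner than h, else '3D'."""
    centred = neighbour_positions - neighbour_positions.mean(axis=0)
    cov = centred.T @ centred / len(neighbour_positions)
    eigvals = np.sort(np.linalg.eigvalsh(cov))      # ascending eigenvalues
    thickness = 2.0 * np.sqrt(eigvals[0])           # spread along thinnest axis
    return "2D" if thickness < h else "3D"

rng = np.random.default_rng(0)
# a flat sheet of particles with tiny vertical jitter -> thin film model
sheet = np.column_stack([rng.uniform(0, 1, 200),
                         rng.uniform(0, 1, 200),
                         rng.normal(0, 1e-3, 200)])
print(classify_particle(sheet, h=0.05))             # -> 2D
# an isotropic blob of particles -> bulk flow model
blob = rng.normal(0, 0.2, size=(200, 3))
print(classify_particle(blob, h=0.05))              # -> 3D
```

In practice the threshold would be tied to the smoothing length of the meshless discretization, so that the model switch tracks the resolution actually available at each particle.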
The nonlinear Schrödinger equation (NLSE) is one of the most important equations in quantum mechanics, and appears in a wide range of applications including optical fibre communications, plasma physics and biomolecule dynamics. It is a notoriously difficult problem to solve numerically, as solutions have very steep temporal and spatial gradients. Adaptive moving mesh methods (r-adaptive) attempt to optimise the accuracy obtained with a fixed number of nodes by moving them into regions of steep solution features. On its own, however, this approach is limited if the solution becomes more or less difficult to resolve over the period of interest. Adaptive mesh refinement (h-adaptive), where the mesh is locally coarsened or refined, is an alternative adaptive strategy that is popular for time-independent problems. In this paper, we consider the effectiveness of a combined method (hr-adaptive) for solving the NLSE in one space dimension. Simulations are presented indicating excellent solution accuracy compared with other moving mesh approaches. The method is also shown to control the spatial error based on the user's input error tolerance. Evidence is also presented indicating second-order spatial convergence using a novel monitor function to generate the adaptive moving mesh.
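The r-adaptive idea of moving a fixed number of nodes can be sketched by equidistributing a monitor function. The sketch below is not the paper's scheme (which uses a novel monitor function); it uses the classical arc-length monitor M(x) = sqrt(1 + u_x²), placing nodes so that each cell carries an equal share of the integral of M and thereby clustering points where the solution is steep, e.g. near a soliton-like peak:

```python
import numpy as np

def equidistribute(x: np.ndarray, u: np.ndarray, n_new: int) -> np.ndarray:
    """Return n_new mesh nodes equidistributing the arc-length monitor."""
    ux = np.gradient(u, x)
    M = np.sqrt(1.0 + ux**2)                  # arc-length monitor function
    # cumulative integral of M (trapezoid rule), normalised to [0, 1]
    I = np.concatenate([[0.0], np.cumsum(0.5 * (M[1:] + M[:-1]) * np.diff(x))])
    I /= I[-1]
    # invert: place new nodes at equal increments of the cumulative monitor
    return np.interp(np.linspace(0.0, 1.0, n_new), I, x)

x = np.linspace(-10.0, 10.0, 401)
u = 1.0 / np.cosh(2.0 * x)                    # sech profile, steep near x = 0
xa = equidistribute(x, u, 41)
# adapted cells are smallest where the solution gradient is largest
print(f"adapted spacing: min {np.diff(xa).min():.3f}, max {np.diff(xa).max():.3f}")
```

A full hr-adaptive solver would alternate such node movement with local coarsening/refinement so the node count itself can respond to the error tolerance.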
In this paper we present a basis selection method that can be used with ℓ1-minimization to adaptively determine the large coefficients of polynomial chaos expansions (PCE). The adaptive construction produces anisotropic basis sets that have more terms in important dimensions and limits the number of unimportant terms that increase mutual coherence and thus degrade the performance of ℓ1-minimization. The important features and the accuracy of basis selection are demonstrated with a number of numerical examples. Specifically, we show that for a given computational budget, basis selection produces a more accurate PCE than would be obtained if the basis were fixed a priori. We also demonstrate that basis selection can be applied with non-uniform random variables and can leverage gradient information.
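A minimal sketch of the ingredients involved, under stated assumptions and not the paper's algorithm: a tensor Legendre PCE basis is fitted by ℓ1-regularised least squares (here via plain ISTA iterations rather than a production solver), and the recovered coefficient magnitudes are aggregated per dimension to indicate where an anisotropic basis should add terms. The target function below is synthetic and depends almost entirely on the first input dimension.

```python
import numpy as np
from numpy.polynomial.legendre import legval

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(40, 2))
# sparse "true" model: 1.0 * P1(x1) + 0.5 * P2(x1), independent of x2
y = legval(X[:, 0], [0, 1.0]) + 0.5 * legval(X[:, 0], [0, 0, 1.0])

# total-degree-4 tensor Legendre basis in 2 dimensions (15 terms)
idx = [(i, j) for i in range(5) for j in range(5) if i + j <= 4]
A = np.column_stack([
    legval(X[:, 0], np.eye(5)[i]) * legval(X[:, 1], np.eye(5)[j])
    for i, j in idx])

# ISTA iterations for the LASSO problem: min 0.5*||A b - y||^2 + lam*||b||_1
lam, L = 0.01, np.linalg.norm(A, 2) ** 2        # L = Lipschitz const of gradient
b = np.zeros(A.shape[1])
for _ in range(5000):
    g = b - A.T @ (A @ b - y) / L               # gradient step
    b = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)   # soft threshold

# dimension importance: total |coefficient| over terms active in each dimension
imp = [sum(abs(b[k]) for k, (i, j) in enumerate(idx) if (i, j)[d] > 0)
       for d in range(2)]
print(imp)   # dimension 1 dominates -> grow the basis anisotropically there
```

An adaptive loop in this spirit would then add higher-degree terms only in the dominant dimension(s), keeping mutual coherence low, which is the effect the abstract describes.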
This article provides a comprehensive review of current practices and methodologies within the field of learning analytics, structured around a dedicated closed-loop framework. This framework integrates various aspects of learning analytics into a cohesive whole, emphasizing the interplay between data collection, processing and analysis, and adaptivity and personalization, all connected by the learners involved and underpinned by educational and psychological theory. In reviewing each step of the closed loop, the article delves into advancements in data collection, exploring how technological progress has expanded data collection methods, focusing particularly on the potential of multimodal data acquisition and how theory can inform this step. The processing and analysis step is thoroughly reviewed, highlighting a range of methods including machine learning and AI, and discussing the critical balance between prediction accuracy and interpretability. The adaptivity and personalization step examines the current state of research, underscoring significant gaps and the necessity for theory-informed, personalized learning interventions. Overall, the article underscores the importance of interdisciplinarity in learning analytics, advocating for the integration of insights from various fields to address challenges such as ethical data usage and the creation of quality learning experiences. This framework and review aim to guide future research and practice in learning analytics, promoting the development of effective, learner-centric educational environments driven by a balance of data-driven insights and theoretical understanding.
•Introduces the Closed-Loop Learning Analytics Framework
•Emphasizes integration of multimodal data for deeper learning insights
•Reviews advancements in data collection, processing, and personalization
•Emphasizes the impact of AI on adaptive and personalized learning experiences
•Stresses the role of educational and psychological theory for learning analytics
We propose a method that morphs high-order meshes such that their boundaries and interfaces coincide/align with implicitly defined geometries. Our focus is particularly on the case when the target surface is prescribed as the zero isocontour of a smooth discrete function. Common examples of this scenario include using level set functions to represent material interfaces in multimaterial configurations, and evolving geometries in shape and topology optimization. The proposed method formulates the mesh optimization problem as a variational minimization of the sum of a chosen mesh-quality metric using the Target-Matrix Optimization Paradigm (TMOP) and a penalty term that weakly forces the selected faces of the mesh to align with the target surface. The distinct features of the method are use of a source mesh to represent the level set function with sufficient accuracy, and adaptive strategies for setting the penalization weight and selecting the faces of the mesh to be fit to the target isocontour of the level set field. We demonstrate that the proposed method is robust for generating boundary- and interface-fitted meshes for curvilinear domains using different element types in 2D and 3D.
•Implicit high-order meshing using boundary and interface fitting.
•Approach targets applications where the target surface is prescribed implicitly using level-set functions.
•r-adaptivity method is demonstrated to be robust at adapting easy-to-generate meshes to curvilinear boundaries and interfaces.
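Schematically, the variational problem described in the abstract can be written as the minimization of a penalized mesh-quality objective; the notation below is illustrative rather than taken verbatim from the article:

```latex
F(\mathbf{x}) \;=\; \sum_{E \in \mathcal{M}} \int_{E} \mu\!\big(T(\mathbf{x})\big)\,\mathrm{d}A
\;+\; w \sum_{s \in \mathcal{S}} \sigma(\mathbf{x}_s)^2 ,
```

where $\mu$ is the chosen TMOP mesh-quality metric of the target-to-physical Jacobian $T$, $\sigma$ is the discrete level-set function (interpolated from the source mesh), $\mathcal{S}$ is the set of nodes on the faces selected for fitting, and $w$ is the adaptively set penalization weight. Driving the second term to zero weakly forces the selected faces onto the zero isocontour $\sigma = 0$ while the first term preserves element quality.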
Constructing efficient and accurate parametrizations of subgrid‐scale processes is a central area of interest in the numerical modelling of geophysical fluids. Using a modified version of the two‐level Lorenz '96 model, we present here a proof of concept of a scale‐adaptive parametrization constructed using statistical mechanical arguments. By suitable use of the Ruelle response theory and the Mori–Zwanzig projection method, it is possible to derive explicitly a parametrization for the fast variables that translates into deterministic, stochastic and non‐Markovian extra terms in the equations of motion for the variables of interest. We show that our approach is computationally parsimonious and has great flexibility, as it is explicitly scale‐adaptive, and we prove that it is competitive compared with empirical ad hoc approaches. While the parametrization proposed here is universal and can easily be adapted analytically to changes in parameter values by a simple rescaling procedure, the parametrization constructed with the ad hoc approach needs to be recomputed each time the parameters of the systems are changed. The price we pay for the higher flexibility of the method proposed here is having a lower accuracy in each individual case.
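For context, the standard two-level Lorenz '96 system that such parametrizations target can be sketched as below (Wilks-style conventions; the paper uses a modified version, so the constants here are illustrative only). The slow–fast coupling term, the sum over the fast variables Y in the X equation, is exactly what a deterministic, stochastic or non-Markovian closure replaces.

```python
import numpy as np

K, J, F, h, b, c = 8, 4, 20.0, 1.0, 10.0, 10.0   # illustrative parameter values

def tendencies(X, Y):
    """Coupled tendencies of slow (X) and fast (Y) Lorenz '96 variables."""
    dX = (np.roll(X, 1) * (np.roll(X, -1) - np.roll(X, 2)) - X + F
          - (h * c / b) * Y.reshape(K, J).sum(axis=1))    # fast->slow coupling
    dY = (c * b * np.roll(Y, -1) * (np.roll(Y, 1) - np.roll(Y, -2))
          - c * Y + (h * c / b) * np.repeat(X, J))        # slow->fast forcing
    return dX, dY

def rk4_step(X, Y, dt):
    k1 = tendencies(X, Y)
    k2 = tendencies(X + 0.5 * dt * k1[0], Y + 0.5 * dt * k1[1])
    k3 = tendencies(X + 0.5 * dt * k2[0], Y + 0.5 * dt * k2[1])
    k4 = tendencies(X + dt * k3[0], Y + dt * k3[1])
    X = X + dt * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6
    Y = Y + dt * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6
    return X, Y

rng = np.random.default_rng(2)
X, Y = rng.standard_normal(K), 0.1 * rng.standard_normal(K * J)
for _ in range(2000):               # small dt: the fast scales are stiff
    X, Y = rk4_step(X, Y, dt=0.001)
print(X.round(2))
```

A parametrized model would evolve only X, with the coupling sum replaced by closure terms; the scale-adaptive construction in the paper yields such terms analytically, rescalable under parameter changes.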