This study aims to produce accurate predictions of NO2 concentrations at a specific station of a monitoring network located in the Bay of Algeciras (Spain). Artificial neural networks (ANNs) and sequence-to-sequence long short-term memory networks (LSTMs) were used to create the forecasting models. Additionally, a new prediction method was proposed that combines LSTMs using a rolling window scheme with a cross-validation procedure for time series (LSTM-CVT). Two different strategies were followed regarding the input variables: using NO2 from the station itself, or employing NO2 and other pollutant data from any station of the network plus meteorological variables. The ANN and LSTM-CVT exogenous models used lagged datasets of different window sizes. Several feature ranking methods were used to select the top lagged variables and include them in the final exogenous datasets. Prediction horizons of t + 1, t + 4 and t + 8 were employed. The inclusion of exogenous variables enhanced model performance, especially for t + 4 (ρ ≈ 0.68 to ρ ≈ 0.74) and t + 8 (ρ ≈ 0.59 to ρ ≈ 0.66). The proposed LSTM-CVT method delivered promising results, as the best performing models per prediction horizon employed this new methodology. Additionally, for each parameter combination, it obtained lower error values than the ANNs in 85% of the cases.
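The rolling window scheme underlying a time-series cross-validation like the one in LSTM-CVT can be illustrated with a minimal sketch. The function name, window sizes and step are illustrative assumptions, not the study's actual configuration:

```python
import numpy as np

def rolling_window_splits(n_samples, train_size, test_size, step=None):
    """Yield (train_idx, test_idx) pairs for a rolling-window
    time-series cross-validation: the training window slides
    forward and the test block always follows it in time."""
    step = step or test_size
    start = 0
    while start + train_size + test_size <= n_samples:
        train_idx = np.arange(start, start + train_size)
        test_idx = np.arange(start + train_size,
                             start + train_size + test_size)
        yield train_idx, test_idx
        start += step

# Toy example: 20 hourly observations, training window of 10, test blocks of 5
splits = list(rolling_window_splits(20, train_size=10, test_size=5))
# → two folds: train on hours 0–9 / test 10–14, then train 5–14 / test 15–19
```

Unlike a shuffled k-fold, every test block lies strictly after its training window, which is what makes the procedure valid for time series.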
Predicting air quality is a very important task, as it is known to have a significant impact on health. The Bay of Algeciras (Spain) is a highly industrialised area with one of the largest superports in Europe. During the period 2017–2019, different data were recorded at the monitoring stations of the bay, forming a database of 131 variables (air pollutants, meteorological information, and vessel data), which was used to predict SO2 and NO2 at the Algeciras station using long short-term memory models. Four different approaches have been developed to forecast SO2 and NO2 1 h and 4 h ahead in Algeciras. The first uses the remaining 130 exogenous variables. The second uses only the time series data without exogenous variables. The third approach consists of using an autoregressive time series arrangement as input, and the fourth is similar, using the time series together with wind and ship data. The results showed that SO2 is better predicted with autoregressive information and NO2 is better predicted with ship and wind autoregressive time series, indicating that NO2 is closely related to combustion engines and can be better predicted. The interest of this study lies in the fact that it can serve as a resource for making informed decisions for authorities, companies, and citizens alike.
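The "autoregressive time series arrangement" of the third and fourth approaches can be sketched as building a lag matrix: each input row holds the most recent observations and the target is the value a given horizon ahead. The function name and lag/horizon values below are illustrative assumptions:

```python
import numpy as np

def make_lagged_dataset(series, n_lags, horizon):
    """Arrange a 1-D series into an autoregressive design matrix:
    each row holds the n_lags most recent values and the target is
    the value `horizon` steps ahead."""
    series = np.asarray(series, dtype=float)
    X, y = [], []
    for t in range(n_lags - 1, len(series) - horizon):
        X.append(series[t - n_lags + 1:t + 1])   # lagged inputs up to time t
        y.append(series[t + horizon])            # future target
    return np.array(X), np.array(y)

# Toy example: predict 1 h ahead from the last 3 hourly values
X, y = make_lagged_dataset([1, 2, 3, 4, 5, 6], n_lags=3, horizon=1)
# X rows: [1,2,3], [2,3,4], [3,4,5]; targets y: [4, 5, 6]
```

For the fourth approach, lagged wind and ship columns would simply be stacked next to the pollutant lags before training.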
The uncertainty of cargo flows is a limitation in port management, where decision-making processes need accurate information about future values. This work aims at predicting the future values of Ro-Ro perishable cargo flow at the Port of Algeciras Bay using a machine learning-based forecasting system. Two datasets consisting of daily records of fresh fruits and vegetables from 2010 to 2017 were analyzed. Additionally, these two time series were pre-processed with an exponential moving average method to obtain a smoothed version of the original ones. Multiple Linear Regression, Support Vector Machines, Long Short-Term Memory networks and an ensemble approach have been used to build a forecasting system and obtain the future values of the perishable cargo. The results of the analysis showed that this machine learning-based system achieved a 14.83% better performance rate than a baseline persistence model in terms of root mean squared error on the fresh fruits dataset, and 11.3% better on the vegetables one. In general, the models' average performance rates are better using the smoothed version of the time series rather than the original ones.
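The exponential moving average pre-processing step can be written in a few lines. This is a minimal sketch; the smoothing factor and data below are illustrative, not the values used in the study:

```python
def exponential_moving_average(series, alpha=0.3):
    """Smooth a series with an EMA: s[0] = x[0],
    s[t] = alpha * x[t] + (1 - alpha) * s[t-1]."""
    smoothed = [float(series[0])]
    for x in series[1:]:
        smoothed.append(alpha * float(x) + (1 - alpha) * smoothed[-1])
    return smoothed

# Toy daily cargo series (tonnes), illustrative only
daily_cargo = [100, 120, 80, 140, 130]
smooth = exponential_moving_average(daily_cargo, alpha=0.5)
# smooth[1] = 0.5*120 + 0.5*100 = 110.0
```

A higher `alpha` keeps the smoothed series closer to the raw one; a lower `alpha` damps day-to-day spikes more aggressively, which is the effect the abstract credits for the improved average performance.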
Machine learning methods are a powerful tool to detect workload peaks and congestion in goods inspection facilities of seaports. In this paper, time series data of freight inspection volumes at the Border Inspection Posts in the Port of Algeciras Bay were used to construct four datasets based on different sizes of autoregressive window, and several machine learning and ensemble models were used to aid decision-making in the inspection process. Moreover, an aggregation/disaggregation procedure to make predictions was proposed and compared for two different prediction horizons: daily (t+1) and weekly (t+7) predictions. In general, results showed that neural networks performed better than any other model independently of the size of the autoregressive window. The result obtained by a weighted average ensemble model was statistically significantly better than that of any other model. Moreover, the proposed aggregation/disaggregation procedure provided better and, in terms of variance, more robust performance results than considering daily or weekly predictions alone.
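One way to picture an aggregation/disaggregation step is to take a weekly total and split it back into days using historical weekday shares. This is a minimal sketch of that idea; the function name and the weekday profile are illustrative assumptions, not the paper's actual procedure:

```python
import numpy as np

def disaggregate_weekly(weekly_forecast, history_daily):
    """Split a weekly total forecast into daily values using the
    average share of each weekday observed in the daily history
    (history_daily: array of shape (n_weeks, 7))."""
    history_daily = np.asarray(history_daily, dtype=float)
    shares = history_daily.sum(axis=0) / history_daily.sum()
    return weekly_forecast * shares

# Toy history: two weeks of daily inspection counts (Mon..Sun)
history = [[10, 20, 20, 20, 20, 5, 5],
           [10, 20, 20, 20, 20, 5, 5]]
daily = disaggregate_weekly(70.0, history)
# shares sum to 1, so the daily values add back to the weekly total
```

Because the shares sum to one, the daily profile is guaranteed to be consistent with the weekly forecast, which is what makes comparing the two horizons meaningful.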
In recent years, despite a decline in international trade and disruptions in the supply chain caused by COVID-19, the main container terminals in Latin America and the Caribbean (LAC) have increased their container volumes. This growth has necessitated significant adaptations by seaports and their authorities to meet new demands. Consequently, analysis has focused on the performance, efficiency, and competitiveness of these terminals, particularly their most relevant logistical aspects. In this paper, a multi-objective hybrid approach was employed: the Principal Component Analysis (PCA) technique was combined with the Technique for Order of Preference by Similarity to the Ideal Solution (TOPSIS) to rank LAC container terminals and identify operational criteria affecting efficiency. The analysis considered all input variables (berth/quay length, quay draught, yard area, number of quay cranes (portainer), number of yard cranes (transtainer), reachstackers, multicranes, daily container movement capacity, number of reefer container stations, number of terminals, and distance to the Panama Canal) and the output variable (port performance expressed in TEUs from 2014 to 2023). The results revealed noteworthy findings for several terminals, particularly Colón, Santos, and Cartagena, the latter standing out as the main container port in LAC not only in annual TEU throughput, but also in resource utilization.
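TOPSIS itself follows a standard recipe (normalise, weight, measure distances to the ideal and anti-ideal solutions, score by relative closeness) and can be sketched compactly. The toy decision matrix below, with three terminals and two benefit criteria, is purely illustrative:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS: vector-normalise the decision
    matrix, apply criterion weights, measure Euclidean distances to
    the ideal and anti-ideal solutions, and return each alternative's
    closeness coefficient (higher is better)."""
    M = np.asarray(matrix, dtype=float)
    W = np.asarray(weights, dtype=float)
    V = (M / np.linalg.norm(M, axis=0)) * W
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)

# Toy example: 3 terminals, 2 benefit criteria (e.g. quay cranes, TEUs)
scores = topsis([[4, 1000], [8, 3000], [6, 2000]],
                weights=[0.5, 0.5], benefit=[True, True])
# the terminal with the highest score ranks first
```

In the paper's hybrid scheme, PCA would be applied first to condense the correlated operational variables before a ranking of this kind is computed.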
The forecasting of freight transportation, especially in the short-term case, is an important topic in daily supply chain management. Intermodal freight transportation is subject to multiple complex calendar effects arising in the port environment. The use of prediction methods provides information that may be helpful as a decision-making tool in the management and planning of operations processes in ports. This work addresses the forecasting problem on a daily basis through a novel two-stage scheme combination to offer reliable predictions of fresh freight weight on Ro-Ro (roll-on/roll-off) transport for 7 and 14 days ahead. The study compares daily forecasting with a weekly forecasting approach. Stage I applies database preprocessing and Bayesian regularization neural networks (BRNN). In Stage II, an ensemble framework of the best BRNN models is used to enhance the Stage I forecasting. The results show that the models assessed are a promising tool to predict freight time series for Ro-Ro transport.
A high number of freight inspections carried out at Border Inspection Posts (BIPs) of ports can lead to significant time delays and congestion problems within the port system, decreasing the efficiency of the port. Therefore, this work focuses on achieving the most accurate prediction of the daily number of goods subject to inspection at BIPs. Five prediction methods were used to this end: multiple linear regression, seasonal autoregressive integrated moving average, generalized autoregressive conditional heteroskedasticity, artificial neural networks, and support vector regression models. Several nonlinear tests were used to study the nature of the time series, and the best method was identified by comparing the prediction results based on performance indexes that measure goodness-of-fit. The result of this study may become a supporting tool for predicting the number of goods subject to inspection at BIPs of other international seaports or airports.
Hyperspectral technology has been playing a leading role in monitoring oil spills in marine environments, which is an issue of international concern. In the case of monitoring oil spills in local areas, hyperspectral technology of small dimensions is the ideal solution. This research explores the use of encoded hyperspectral signatures to develop automated classifiers capable of discriminating between polluted and clean water and distinguishing between various types of oil. The overall objective is to leverage these classifiers to improve the performance of conventional systems that rely solely on hyperspectral imagery. The acquisition of the hyperspectral signatures of water and hydrocarbons was carried out with a spectroradiometer. The spectroradiometer used in this study covers the 350–1000 nm (visible near-infrared) and 1000–2500 nm (short-wavelength infrared) ranges, which gives detailed information regarding the targets of interest. Different neural autoencoders (AEs) have been developed to reduce the inputs to different dimensionalities, from 1 to 15. Each of these encoded sets was used to train decision tree (DT) classifiers. The results are very promising, as they show that the AE models encoded the data with correlation coefficients above 0.95. The classifiers trained with the different sets provide accuracies close to 1.
This study focuses on how to determine the most relevant variables in order to estimate the hourly NO2 concentrations in a monitoring network located in the Bay of Algeciras (Spain). For each station of the network, artificial neural networks and multiple linear regression have been used to compute hourly estimation models. Meteorological variables and hourly NO2 concentrations from the nearby stations have been used as inputs, and a feature selection procedure has been applied as a previous step. The different models developed have been statistically compared. The inputs used in the best estimation model for each station were the most important to estimate each hourly NO2 concentration level. These estimations can be a very useful resource to provide autonomous capacities such as automatic decalibration detection or missing data imputation in monitoring networks. Finally, the similarities between stations, according to the relevance of variables, have been analysed with the aid of a hierarchical clustering algorithm.
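The multiple linear regression side of such an estimation model — predicting one station's series from neighbouring stations via least squares — can be sketched minimally. The function names and the synthetic single-neighbour data are illustrative assumptions:

```python
import numpy as np

def fit_linear_estimator(X_neighbors, y_station):
    """Ordinary least squares with an intercept column: estimate one
    station's hourly NO2 from neighbouring stations' measurements."""
    A = np.column_stack([np.ones(len(X_neighbors)), X_neighbors])
    coef, *_ = np.linalg.lstsq(A, y_station, rcond=None)
    return coef

def predict(coef, X_neighbors):
    """Apply the fitted coefficients to new neighbour measurements."""
    A = np.column_stack([np.ones(len(X_neighbors)), X_neighbors])
    return A @ coef

# Synthetic example: target station = 2 * neighbour + 3 (exact relation)
x = np.arange(10, dtype=float)
coef = fit_linear_estimator(x.reshape(-1, 1), 2 * x + 3)
# coef ≈ [3.0, 2.0]  (intercept, slope)
```

Once such a model is fitted per station, a large residual between the estimate and the actual reading is the kind of signal that supports decalibration detection, and the estimate itself can stand in for missing data.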
This study presents a comparison between sixteen filter ranking methods applied to a real air pollution problem. Adaptations of the Minimum-Redundancy-Maximum-Relevance (mRMR) algorithm to use Spearman's rank correlation, the kernel canonical correlation analysis, the Hilbert–Schmidt independence criterion, correntropy, Pearson's correlation and the distance correlation are included among them. These methods were compared by estimating the hourly NO2 concentrations at three monitoring stations located in the Bay of Algeciras (Spain). The estimation models were generated using Bayesian regularized artificial neural networks. Different estimation cases were tested for each ranking method. Finally, results were statistically compared to determine which filter ranking strategy produced the best performing model in each case. The proposed estimation scenarios showed how mRMR methods had better results than all the remaining methods when a small number of features was selected. However, their advantage was not so evident when the number of selected features increased. Results from the proposed mRMR methods were promising, especially in the case of the distance correlation mRMR, the kernel canonical correlation analysis mRMR and the Spearman's rank correlation mRMR. These ranking methods performed better than the original mRMR algorithm that employs mutual information internally.
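A greedy mRMR selection using |Spearman| as both the relevance measure (feature vs. target) and the redundancy measure (feature vs. already-selected features) can be sketched as follows. This is a simplified illustration — rank ties are ignored, and all names and data are assumptions, not the paper's implementation:

```python
import numpy as np

def spearman(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks
    (tie handling omitted for brevity)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return np.corrcoef(rx, ry)[0, 1]

def mrmr_spearman(X, y, k):
    """Greedy mRMR: start from the most relevant feature, then keep
    adding the feature maximising (relevance - mean redundancy)."""
    n_feat = X.shape[1]
    relevance = np.array([abs(spearman(X[:, j], y)) for j in range(n_feat)])
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(n_feat):
            if j in selected:
                continue
            redundancy = np.mean([abs(spearman(X[:, j], X[:, s]))
                                  for s in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected

# Toy data: feature 0 tracks the target, feature 2 is a redundant copy
rng = np.random.default_rng(0)
y = rng.normal(size=50)
X = np.column_stack([y + 0.01 * rng.normal(size=50),   # relevant
                     rng.normal(size=50),              # noise
                     y + 0.01 * rng.normal(size=50)])  # redundant copy
picked = mrmr_spearman(X, y, k=2)
```

Swapping `spearman` for another dependence measure (distance correlation, HSIC, correntropy, and so on) yields the other mRMR variants the study compares, since only the relevance/redundancy criterion changes.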