Journal Sciences News
World Science and Technology
February 2018
New hybrid heuristic algorithm for the clustered traveling salesman problem
Publication date: February 2018
Source:Computers & Industrial Engineering, Volume 116 Author(s): M
February 2018
Dynamic product innovation and production decisions under quality authorization
Publication date: February 2018
Source:Computers & Industrial Engineering, Volume 116 Author(s): Zhi Li, Jian Ni We consider the joint product innovation and production decisions of a manufacturing firm under the existence of quality authorization from a third party, which has gained growing popularity in recent years. A dynamic control model is developed to analyse the effects of quality authorization. Combining the techniques of the Pontryagin maximum principle and backward induction, the optimal decisions on production and investment in product innovation before and after attaining the quality authorization are analysed. The analytical solution of the optimal investment and production decisions is derived provided that the time for the firm to obtain the quality authorization is known. To fully solve the firm’s optimization problem, an iterative algorithm is then introduced to calculate the best time to attain the quality authorization. We find that although the firm should make continuous and incremental improvements in product quality, there can be jumps in the optimal production and investment levels. While the investment in product innovation is higher before obtaining the quality authorization than after, production is lower before obtaining the quality authorization. Moreover, the firm should attain the quality authorization sooner under a less costly product innovation investment, a lower depreciation rate due to ageing of technology, a smaller production cost, and a lower interest rate. Finally, we compare our results with existing studies and discuss the managerial implications.
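For readers unfamiliar with the solution technique, the Pontryagin maximum principle conditions for a generic discounted control problem of this type take the form below; the profit function π, quality state q, investment control u and dynamics f are illustrative placeholders, not the authors’ exact model.

```latex
\max_{u(t)} \int_0^{\infty} e^{-rt}\,\pi\bigl(q(t),u(t)\bigr)\,dt
\quad \text{s.t.} \quad \dot{q}(t) = f\bigl(q(t),u(t)\bigr),
\qquad
H(q,u,\lambda,t) = e^{-rt}\,\pi(q,u) + \lambda(t)\,f(q,u),
\qquad
\frac{\partial H}{\partial u} = 0,
\qquad
\dot{\lambda}(t) = -\frac{\partial H}{\partial q}.
```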
February 2018
A hybrid variable neighborhood search algorithm for the hot rolling batch scheduling problem in compact strip production
Publication date: February 2018
Source:Computers & Industrial Engineering, Volume 116 Author(s): Biao Zhang, Quan-ke Pan, Liang Gao, Xin-li Zhang, Qing-da Chen This paper deals with a hot rolling batch scheduling (HRBS) problem arising from the compact strip production (CSP) process, which is one of the most popular production systems in the modern iron-steel industry for producing sheet strips. The HRBS problem aims to determine a sequence of the sheet strips in a predetermined number of rolling turns with the objective of minimizing the average thickness change. In this paper, a mathematical model based on a comprehensive investigation is first given. Then a constructive heuristic based on the problem-specific characteristics is presented to generate initial feasible solutions. The heuristic can guarantee a minimum number of rolling turns to accommodate all the ordered sheet strips, but generally performs poorly on the objective of average thickness change. To improve the objective, a hybrid variable neighborhood search algorithm (HVNS) is proposed. In the HVNS, a thickness value permutation is used to encode the solution and four neighborhood structures are well designed. The fruit fly optimization algorithm is integrated to improve search efficiency. Correspondingly, a neighborhood switching strategy is developed to improve local search ability. Moreover, a restart strategy based on the block swapping operator is used to help the algorithm escape from local optima. To investigate the effectiveness of the solution approach, two sets of instances are tested, including real-world instances and randomly generated instances. The performance of the proposed HVNS is evaluated by comparing it with other existing algorithms, and the experimental results demonstrate that the proposed algorithm performs much better.
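As background, the generic variable neighborhood search loop that the HVNS builds on can be sketched as follows; the neighborhood operators, toy "thickness" objective and parameters below are placeholders, not the authors’ four problem-specific structures or their encoding.

```python
import random

def variable_neighborhood_search(initial, cost, neighborhoods, max_iters=2000):
    """Generic VNS skeleton: cycle through neighborhood structures and
    return to the first one whenever an improving move is found."""
    best, best_cost = list(initial), cost(initial)
    for _ in range(max_iters):
        k = 0
        while k < len(neighborhoods):
            candidate = neighborhoods[k](best)
            cand_cost = cost(candidate)
            if cand_cost < best_cost:       # improvement: restart from the first neighborhood
                best, best_cost, k = candidate, cand_cost, 0
            else:                           # no improvement: move to the next neighborhood
                k += 1
    return best, best_cost

def swap_move(seq):
    s = list(seq)
    i, j = random.sample(range(len(s)), 2)
    s[i], s[j] = s[j], s[i]
    return s

def insert_move(seq):
    s = list(seq)
    i, j = random.sample(range(len(s)), 2)
    s.insert(j, s.pop(i))
    return s

# Toy objective: total absolute change between consecutive "thickness" values.
thickness = [random.uniform(1.0, 6.0) for _ in range(20)]
total_change = lambda order: sum(abs(thickness[a] - thickness[b])
                                 for a, b in zip(order, order[1:]))
best_order, best_val = variable_neighborhood_search(
    list(range(20)), total_change, [swap_move, insert_move])
print(best_val)
```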
February 2018
Data-driven bearing fault identification using improved hidden Markov model and self-organizing map
Publication date: February 2018
Source:Computers & Industrial Engineering, Volume 116 Author(s): Zefang Li, Huajing Fang, Ming Huang, Ying Wei, Linlan Zhang Efficient condition monitoring and fault diagnosis of bearings are of great practical significance since bearings are key elements in most rotating manufacturing machinery. In this study, a condition monitoring index of bearings is developed based on the self-organizing map (SOM) in order to detect incipient bearing faults quickly. It requires low computation cost and is robust to changes in load level and motor speed, hence it is quite suitable for online condition monitoring of bearings. Furthermore, a novel hybrid algorithm combining the diversified gradient descent (DGD) method and Bayesian model selection (BMS), called DGD-BMS, is formulated under a general Bayesian framework for the optimization of discrete hidden Markov model (DHMM) parameters. The flexibility of the DGD-BMS lies in the fact that the algorithm can increase the diversity of the search paths generated for the DHMM parameters so that the true underlying parameters are more likely to be found. Thus it provides an effective way to avoid becoming trapped in one local maximum. Both a simulation and an industrial case study are presented to validate the proposed approach. Results show that the monitoring index can detect incipient bearing faults efficiently with 100% accuracy even under varying load levels, and the DGD-BMS method achieves an average classification rate of 99.58%. The proposed method exhibits excellent performance compared to the conventional gradient descent (GD) and Baum-Welch (BW) methods.

February 2018
The capacitated hub covering location-routing problem for simultaneous pickup and delivery systems
Publication date: February 2018
Source:Computers & Industrial Engineering, Volume 116 Author(s): Hossein Karimi In this study, a specific type of hub network topology, called hub location-routing, is presented, in which the routes between the nodes assigned to a hub form a tour. The model minimizes the total cost of hub location and vehicle routing, subject to predefined travel time, hub capacity, vehicle capacity, and simultaneous pickups and deliveries. A polynomial-size mixed integer programming formulation is introduced for the single allocation type of the problem. In this paper, a set of valid inequalities is proposed for the formulation. In addition, a tabu-search based heuristic is suggested which determines the hub locations and vehicle routes simultaneously. A series of computational tests is then executed to evaluate the performance of the valid inequalities and the tabu-search based heuristic. The results show that using all valid inequalities improves the solution time of the pure proposed model. Meanwhile, the proposed heuristic works efficiently in finding good-quality solutions for the proposed hub location-routing model.
February 2018
Lot sizing and supplier selection with multiple items, multiple periods, quantity discounts, and backordering
Publication date: February 2018
Source:Computers & Industrial Engineering, Volume 116 Author(s): Hesham K. Alfares, Rio Turnadi A general model is presented for a realistic multi-item lot-sizing problem with multiple suppliers, multiple time periods, quantity discounts, and backordering of shortages. Mixed integer programming (MIP) is used to formulate the problem and obtain the optimum solution for smaller problems. Due to the large number of variables and constraints, the model is too hard to solve optimally for practical problems. In order to tackle larger problem sizes, two heuristic solution methods are proposed. The first method is developed by modifying the Silver-Meal heuristic, and the second one by developing a problem-specific Genetic Algorithm (GA). Both heuristic methods are shown to be effective in solving the lot-sizing problem, but the GA is generally superior.
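For reference, the classical Silver-Meal rule that the first heuristic modifies chooses each lot so that the average cost per period covered (setup plus carrying) stops decreasing; the minimal single-item, single-supplier sketch below ignores the quantity discounts, supplier selection and backordering that the paper adds, and the demand data are invented.

```python
def silver_meal(demand, setup_cost, holding_cost):
    """Classical Silver-Meal lot sizing for a single item.

    demand[t]    : demand in period t
    setup_cost   : fixed ordering cost per order
    holding_cost : cost of carrying one unit for one period
    Returns a list of lot sizes (0 means no order placed that period)."""
    T = len(demand)
    lots = [0] * T
    t = 0
    while t < T:
        best_avg = None
        periods = 1
        while t + periods <= T:
            # Cost of covering periods t .. t+periods-1 with one order placed at t.
            carrying = sum(k * holding_cost * demand[t + k] for k in range(periods))
            avg = (setup_cost + carrying) / periods
            if best_avg is not None and avg > best_avg:
                break                       # average cost started to rise: stop extending
            best_avg, periods = avg, periods + 1
        periods -= 1                        # last horizon before the increase
        lots[t] = sum(demand[t:t + periods])
        t += periods
    return lots

print(silver_meal([80, 100, 125, 100, 50, 50, 100, 125],
                  setup_cost=100, holding_cost=0.5))
```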
February 2018
A Branch-and-Price algorithm for a compressor scheduling problem
Publication date: February 2018
Source:Computers & Industrial Engineering, Volume 116 Author(s): Marcelo Wuttig Friske, Luciana S. Buriol, Eduardo Camponogara This work presents a Branch-and-Price algorithm for solving a compressor scheduling problem with applications in oil production. The problem consists in defining a set of compressors to be installed for supplying the gas-lift demand of oil wells while minimizing the associated costs. Owing to the non-convex nature of the objective function, two piecewise-linear formulations are tested in the pricing subproblem, which is solved with a two-phase strategy. Also, two branching strategies are proposed based on the original problem variables, and a specific rule is created for solving the master problem as an integer program for obtaining feasible solutions. Experimental results are reported for three sets of instances, for which the branch-and-price algorithm obtained optimal solutions for more instances and spent less time on average than the CPLEX solver applied to the piecewise-linear formulation. Furthermore, for the solution of the largest instances within a limited computational time, the proposed branch-and-price algorithm found good feasible solutions, outperforming CPLEX.
February 2018
Optimizing the new coordinated replenishment and delivery model considering quantity discount and resource constraints
Publication date: February 2018
Source:Computers & Industrial Engineering, Volume 116 Author(s): Rui Liu, Yu-Rong Zeng, Hui Qu, Lin Wang In a global purchasing environment, more and more companies have realized that considerable cost savings can be achieved through a coordinated replenishment and delivery (CRD) strategy. A new and practical CRD model with quantity discount (D-CRD) and its extension with constraints (CD-CRD) are proposed. Several important properties of the proposed D-CRD and CD-CRD policies are presented. A heuristic based on these properties and a hybrid Tabu search algorithm is designed to obtain satisfactory solutions for D-CRD and CD-CRD. Computational results demonstrate the effectiveness and efficiency of the algorithms. Although D-CRD is more efficient than CRD, resource constraints significantly weaken the effects of the quantity discount strategy, especially for large-scale problems. Moreover, constraints in the coordinated stage are more sensitive than constraints in the delivery stage.
February 2018
A linguistic solution for double large-scale group decision-making in E-commerce
Publication date: February 2018
Source:Computers & Industrial Engineering, Volume 116 Author(s): Tong Wu, Xinwang Liu, Jindong Qin This paper develops a solution for double large-scale group decision-making problems in E-commerce involving large numbers of attributes and decision-makers. Linguistic principal component analysis is used to reduce the dimensions of the attributes, and fuzzy equivalence clustering with linguistic information is used to aggregate the preferences of the decision-makers. Considering that people tend to give their direct preference information with linguistic variables, a codebook that is used to model such linguistic information with interval type-2 fuzzy sets is constructed. Numerical principal component analysis is extended into linguistic principal component analysis to reduce the dimensions of large-scale attributes under uncertain situations. In addition, a linguistic aggregation operator is extended to aggregate decision information. The large-scale attributes and decision-makers are classified by linguistic principal component analysis and fuzzy equivalence clustering with linguistic information, respectively. Finally, the data used to construct the codebook and the sample matrix of linguistic principal component analysis are obtained through a questionnaire survey. The decision model is applied to customer decisions for E-commerce services to verify its feasibility and effectiveness.
February 2018
A two-stage consensus method for large-scale multi-attribute group decision making with an application to earthquake shelter selection
Publication date: February 2018
Source:Computers & Industrial Engineering, Volume 116 Author(s): Yejun Xu, Xiaowei Wen, Wancheng Zhang In multi-attribute group decision making, it is preferable that a set of experts reach a high degree of consensus amongst their opinions, especially for large-scale group decision making. This paper presents a two-stage method to support the consensus reaching process for large-scale multi-attribute group decision making problems. The first stage classifies the large-scale group into several sub-clusters by utilizing self-organizing maps and, then, an iterative algorithm is proposed to obtain the group preference for each sub-cluster. The second stage treats the group preference of each sub-cluster as the representative preference and collapses each sub-cluster to form a smaller and more manageable group. Then the aforesaid iterative algorithm is utilized to process the new set and select the best alternative(s). Finally, a case study of a real application to earthquake shelter selection and a comparative analysis are given to verify the effectiveness of the proposed method.
February 2018
Sustainable supply chains under government intervention with a real-world case study: An evolutionary game theoretic approach
Publication date: February 2018
Source:Computers & Industrial Engineering, Volume 116 Author(s): Reza Mahmoudi, Morteza Rasti-Barzoki It is clear that the problem of global warming and greenhouse gas emissions is one of the most important and challenging issues of recent years. Governments have a key role in managing this crisis. They can influence the polluting activities of producers by enacting policies and applying incentives. In addition, government policies can also affect the production and competitive markets of industries. Applying overly strict policies can lead to significant reductions in producers’ profit, and even complete business closure. For the first time, this paper models the contrast between government objectives and producers’ targets using a two-population evolutionary game theory approach under different scenarios. Three different scenarios are considered for the government. In the first scenario, the government imposes taxes and subsidies to maximize its profit with an upper bound on total environmental impacts. In the second scenario, the government chooses tariffs that minimize total environmental impacts by considering a lower bound on its profit. Finally, in the third scenario, the government makes a trade-off between its profit and environmental objectives through a linear combination in an objective function. Using the two-population evolutionary game theory approach, the performance of supply chain members under the different government scenarios is modeled. Finally, the proposed sustainable model is applied to the Indian textile industry. The results show that government policy clearly affects producers’ activity, competitive markets and emissions. Imposed tariffs are the most effective government approach to minimizing environmental impacts.
February 2018
Some new Hamacher aggregation operators under single-valued neutrosophic 2-tuple linguistic environment and their applications to multi-attribute group decision making
Publication date: February 2018
Source:Computers & Industrial Engineering, Volume 116 Author(s): Qun Wu, Peng Wu, Ligang Zhou, Huayou Chen, Xianjun Guan This paper proposes an approach to the linguistic multiple attribute group decision making (MAGDM) problem with single-valued neutrosophic 2-tuple linguistic (SVN2TL) assessment information by adding a subjective imprecise estimation of the reliability of the 2-tuple linguistic terms (2TLTs). SVN2TL includes the truth-membership (TM), indeterminacy-membership (IM) and falsity-membership (FM), which can express incomplete, indeterminate and inconsistent information and ideally avoid loss of information and precision in the aggregation process. We first propose the concept of the SVN2TL set (SVN2TLS) and the single-valued neutrosophic 2-tuple linguistic element (SVN2TLE), basic operational rules on SVN2TLEs via Hamacher triangular norms, and a ranking method for SVN2TLEs. Then, some SVN2TL aggregation operators, including the SVN2TL Hamacher weighted averaging (SVN2TLHWA) operator and the SVN2TL Hamacher geometric weighted averaging (SVN2TLHGWA) operator, are developed, and some of their properties are investigated as well. Moreover, we apply the new operators to develop an approach to the MAGDM problem with SVN2TL assessment information, where a model for the optimal weighting vector is constructed. Finally, a numerical example related to the evaluation of emergency response solutions for sustainable community development is provided to show the utility and effectiveness of the method described in this paper. A sensitivity and comparative analysis is also conducted to demonstrate the strength and practicality of the proposed method.
February 2018
Nominal features-based class specific learning model for fault diagnosis in industrial applications
Publication date: February 2018
Source:Computers & Industrial Engineering, Volume 116 Author(s): R. Kannan, S. Solai Manohar, M. Senthil Kumaran Fault Detection and Isolation (FDI) is an initial stage of real-time fault diagnosis in industrial applications. An increase in the number of faults increases the size of the feature set, which limits the accuracy. This paper proposes a suitable technique to extract the relevant features and classify the faults accordingly. Initially, preprocessing removes the unfilled entries in the fault dataset (after the data are collected from the sensors). Then, the Minimal Relevant Feature extraction predicts the features that correspond to the six types of fault classes. The minimum and maximum ranges of voltage, current, vibration and speed due to the above classes are regarded as the features. The modified objective function for the class-specific support vector machine (CS-SVM) classifies the fault classes, which highly contributes to early diagnosis. Extracting the relevant features prior to classification effectively increases the accuracy. The variations of voltage, current, and motor speed according to the injection of vibration faults from the motor bearings determine the impacts effectively. The comparison between the proposed Minimal Relevant Features-based Classification (MRFC) and the existing SVM regarding accuracy, precision, recall, sensitivity, specificity and coefficients (Jaccard, Dice and Kappa) confirms the effectiveness of the proposed MRFC in early fault diagnosis in industrial applications.
Available online 17 January 2018
Closed-loop supply chain network design under disruption risks: A robust approach with real world application
Publication date: February 2018
Source:Computers & Industrial Engineering, Volume 116 Author(s): Armin Jabbarzadeh, Michael Haughton, Amir Khosrojerdi In today’s globalized and highly uncertain business environments, supply chains have become more vulnerable to disruptions. This paper presents a stochastic robust optimization model for the design of a closed-loop supply chain network that performs resiliently in the face of disruptions. The proposed model is capable of considering lateral transshipment as a reactive strategy to cope with operational and disruption risks. The objective is to determine facility location decisions and lateral transshipment quantities that minimize the total supply chain cost across different disruption scenarios. A Lagrangian relaxation algorithm is developed to solve the robust model efficiently. Important managerial insights are obtained from the model implementation in a case study of the glass industry.
Available online 12 January 2018
The impact of product returns on price and delivery time competition in online retailing
Publication date: Available online 17 January 2018
Source:Computers & Industrial Engineering Author(s): Sisi Zhao, Feng Wu, Tao Jia, Lei Shu This paper studies price and promised delivery lead time (PDL) competition between two e-tailers in the context of e-commerce. Product returns are considered, which are affected by late and early delivery inaccuracies, i.e., the delay and early arrival of the random actual delivery time relative to the PDL. The goal is to examine whether a Nash equilibrium exists in the competition, and to explore the impact of product returns on the equilibrium solutions. We consider two cases where the sensitivities of the return rate to late delivery inaccuracy and early delivery inaccuracy are symmetric/asymmetric. The results suggest that (1) firms with lower basic return rates or lower return rate sensitivities always quote higher prices and shorter PDLs; and (2) firms do not always profit when their competitors’ return parameters increase.
Available online 9 January 2018
Dual population multi operators harmony search algorithm for dynamic optimization problems
Publication date: Available online 12 January 2018
Source:Computers & Industrial Engineering Author(s): Ayad Turky, Salwani Abdullah, Anas Dawod Dynamic optimization problems (DOPs) have been widely researched in recent years due to their numerous practical applications in real-life settings. To solve DOPs, the optimizer should be able to track the changes and simultaneously seek global optima in the search space. This paper proposes a dual population multi operators harmony search algorithm for DOPs to deal with changes in the problem landscape. The main difference between the proposed algorithm and other techniques is twofold: a dual population for exploring and exploiting the search space, and the use of multiple operators at different points of the search. Extensive experiments were conducted on the Moving Peaks Benchmark (MPB), and six dynamic test functions proposed at the IEEE Congress on Evolutionary Computation (CEC 2009) were used to evaluate the performance of the proposed algorithm. Empirical results indicate the superiority of the proposed algorithm when compared to state-of-the-art algorithms from the literature.
Available online 2 January 2018
Single-machine group scheduling with new models of position-dependent processing times
Publication date: Available online 9 January 2018
Source:Computers & Industrial Engineering Author(s): Xin Zhang, Lijuan Liao, Wenya Zhang, T.C.E. Cheng, Yuanyuan Tan, Min Ji We propose new models of position-dependent processing times for single-machine group scheduling problems. The two objectives of the scheduling problems are to minimize the makespan and the total completion time, respectively. We present polynomial-time algorithms to solve the problems.
Available online 2 January 2018
An architecture based on RAMI 4.0 to discover equipment to process operations required by products
Publication date: Available online 2 January 2018
Source:Computers & Industrial Engineering Author(s): Marcos A. Pisching, Marcosiris A.O. Pessoa, Fabr
January 2018
A passive RFID tag-based locating and navigating approach for automated guided vehicle
Publication date: Available online 2 January 2018
Source:Computers & Industrial Engineering Author(s): Shaoping Lu, Chen Xu, Ray Y. Zhong, Lihui Wang This research is motivated by industrial applications of Cyber-Physical Systems (CPS), such as smart warehouses and intelligent manufacturing, which require the support of AGVs (Automated Guided Vehicles). One of the key research issues is a locating and navigating approach that is flexible enough to deal with complex industrial applications. This paper takes a passive radio frequency identification (RFID) tag-based locating and navigating approach for AGVs to examine the influences of tags, antennas, and environmental aspects. The approach was validated and a prototype system was built. Several key observations emerge from the implementation and simulation study. Firstly, it is observed that adding an angle reflector can eliminate the back lobe and restrain backward reflection. Secondly, designing the RFID antenna center and the tag center at the same height can reduce the impact area of reflection. Lessons and insights from this study will be significant for industrial practitioners implementing AGVs in smart warehouse and manufacturing management.
January 2018
Two-echelon vehicle routing problem with simultaneous pickup and delivery: Mathematical model and heuristic approach
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Onder Belgin, Ismail Karaoglan, Fulya Altiparmak The vehicle routing problem is one of the most important areas of logistics management. This study considers the two-echelon vehicle routing problem with simultaneous pickup and delivery (2E-VRPSPD), which is a variant of the vehicle routing problem. In the 2E-VRPSPD, the pickup and delivery activities are performed simultaneously by the same vehicles, from the depot to satellites in the first echelon and from satellites to customers in the second echelon. To solve the problem, firstly, a node-based mathematical model is proposed and three valid inequalities from the literature are adapted to strengthen the model. Secondly, because of the NP-hardness of the 2E-VRPSPD, a hybrid heuristic algorithm based on variable neighborhood descent (VND) and local search (LS), called VND_LS, is developed to solve medium- and large-size instances of the 2E-VRPSPD. We conduct an experimental study to investigate the effectiveness of the valid inequalities on the mathematical model and also to evaluate the effectiveness and efficiency of the VND_LS. Computational results show that the valid inequalities have a significant effect in strengthening the mathematical formulation. Furthermore, the VND_LS finds good solutions for the problem efficiently. Finally, we apply the VND_LS to compare single- and two-echelon distribution systems for a supermarket chain located in Turkey. The results indicate that the VND_LS can easily be applied to real-world problems.
January 2018
NDSC based methods for maximizing the lifespan of randomly deployed wireless sensor networks for infrastructures monitoring
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Yousif E.E. Ahmed, Kondo H. Adjallah, Romuald Stock, Imed Kacem, Sharief F. Babiker This paper addresses the problem of wireless sensor network (WSN) lifetime maximization under a limited available energy constraint. Investigations have shown that the existing disjoint set covers (DSC) based scheduling methods for WSN lifetime maximization do not exploit all of the available energy, because of the DSC constraints. Instead, we suggest in this paper scheduling non-disjoint set covers (NDSC) to maximize WSN lifetime. We formulated this problem as an integer linear programming (ILP) model and then developed an approach based on a genetic algorithm (GA) to find the maximal lifespan. As the main contribution, we investigated and designed a new method using NDSC instead of DSC. This approach removes the latter’s constraint, gives a sensor the opportunity to participate in more than one cover, and thereby significantly improves the WSN’s lifetime. We propose an exact method and a genetic algorithm (GA) for efficient NDSC scheduling for WSN lifetime maximization. The exact method relies on integer linear programming (ILP). For the GA-based heuristic, we used a specific arrangement of chromosomes combining several crossover and mutation strategies for encoding the solutions. We provide experimental results for different instances involving sensors with non-identical amounts of initial energy and power consumption. In addition, we provide comparative analysis results between the solutions obtained by both of our methods and the existing DSC-based methods. The comparisons of run times and solution quality reveal the dominance of the solutions yielded by our NDSC-based methods over those based on DSC.
January 2018
A cooperative swarm intelligence algorithm based on quantum-inspired and rough sets for feature selection
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Djaafar Zouache, Fouad Ben Abdelaziz Feature selection is an important preprocessing step for classification as it improves the accuracy and overcomes the complexity of the classification process. However, in order to find a potentially optimal feature subset for the feature selection problem, it is necessary to design an efficient exploration approach that can explore an enormous number of possible feature subsets. It is also necessary to use a powerful evaluation approach to assess the relevance of these feature subsets. This paper presents a new cooperative swarm intelligence algorithm for feature selection based on quantum computation and a combination of the Firefly Algorithm (FA) and Particle Swarm Optimization (PSO). Quantum computation ensures a good trade-off between the exploration and the exploitation of the search space, while the combination of the FA and PSO enables an effective exploration of all the possible feature subsets. We use rough set theory to assess the relevance of the potential generated feature subsets. We tested the proposed algorithm on eleven UCI datasets and compared it with deterministic rough set reduction algorithms and other swarm intelligence algorithms. The experimental results clearly show that our algorithm provides a better rate of feature reduction and high classification accuracy.
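As an illustration of the rough-set evaluation step, the dependency degree gamma(B) = |POS_B(D)| / |U| (the fraction of objects whose B-indiscernibility class falls entirely inside one decision class) can be computed as below; it is this kind of measure, not this exact code or data, that is used to score candidate feature subsets.

```python
from collections import defaultdict

def dependency_degree(data, subset, decision):
    """Rough-set dependency of the decision attribute on feature subset B.

    data     : list of dicts, one per object (condition attributes + decision)
    subset   : iterable of condition-attribute names (the candidate B)
    decision : name of the decision attribute
    Returns |POS_B(D)| / |U|."""
    blocks = defaultdict(list)                      # B-indiscernibility classes
    for obj in data:
        key = tuple(obj[a] for a in subset)
        blocks[key].append(obj[decision])
    positive = sum(len(labels) for labels in blocks.values()
                   if len(set(labels)) == 1)        # class is consistent w.r.t. D
    return positive / len(data)

# Toy example with two condition attributes and a binary decision.
toy = [
    {"a": 1, "b": 0, "d": "yes"},
    {"a": 1, "b": 0, "d": "yes"},
    {"a": 1, "b": 1, "d": "no"},
    {"a": 0, "b": 1, "d": "no"},
    {"a": 0, "b": 1, "d": "yes"},    # inconsistent with the object above
]
print(dependency_degree(toy, ["a", "b"], "d"))   # 3/5 = 0.6
```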
January 2018
A note on the Optimal Periodic Pattern (OPP) algorithm for the system in which buyers periodically order from a vendor
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Stanis
January 2018
Forecasting fault events for predictive maintenance using data-driven techniques and ARMA modeling
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Marcia Baptista, Shankar Sankararaman, Ivo. P. de Medeiros, Cairo Nascimento, Helmut Prendinger, Elsa M.P. Henriques Presently, time-based airline maintenance scheduling does not take fault predictions into account, but happens at fixed time-intervals. This may result in unnecessary maintenance interventions and also in situations where components are not taken out of service despite exceeding their designed risk of failure. To address this issue we propose a framework that can predict when a component/system will be at risk of failure in the future, and therefore, advise when maintenance actions should be taken. In order to facilitate such prediction, we employ an auto-regressive moving average (ARMA) model along with data-driven techniques, and compare the performance of multiple data-driven techniques. The ARMA model adds a new feature that is used within the data-driven model to give the final prediction. The novelty of our work is the integration of the ARMA methodology with data-driven techniques to predict fault events. This study reports on a real industrial case of unscheduled removals of a critical valve of the aircraft engine. Our results suggest that the support vector regression model can outperform the life usage model on the evaluation measures of sample standard deviation, median error, median absolute error, and percentage error. The generalized linear model provides an effective approach for predictive maintenance with comparable results to the baseline. The remaining data-driven models have a lower overall performance.
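A rough illustration of the hybrid idea, a time-series forecast fed as an extra feature into a data-driven regressor, is sketched below with a plain AR(p) model fitted by least squares standing in for the paper's ARMA component; the synthetic degradation signal and feature choices are invented for the example.

```python
import numpy as np

def fit_ar(series, p):
    """Fit an AR(p) model by ordinary least squares and return its coefficients."""
    y = series[p:]
    X = np.column_stack([series[p - k - 1:len(series) - k - 1] for k in range(p)])
    X = np.column_stack([np.ones(len(y)), X])       # intercept + lagged values
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def ar_forecast(series, coeffs):
    """One-step-ahead forecast from the last p observations."""
    p = len(coeffs) - 1
    lags = series[-1:-p - 1:-1]                     # most recent value first
    return coeffs[0] + np.dot(coeffs[1:], lags)

rng = np.random.default_rng(0)
usage = np.cumsum(rng.normal(0.5, 0.2, size=200))   # synthetic degradation signal
coeffs = fit_ar(usage, p=3)
forecast = ar_forecast(usage, coeffs)

# The forecast then becomes one column of the feature matrix handed to any
# data-driven regressor (SVR, random forest, ...) predicting the fault event.
features = np.array([[usage[-1], usage[-1] - usage[-10], forecast]])
print(features)
```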
January 2018
A novel artificial bee colony algorithm based on the cosine similarity
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Wan-li Xiang, Yin-zhen Li, Rui-chun He, Ming-xia Gao, Mei-qing An Artificial bee colony (ABC) is a very popular and powerful optimization tool. However, ABC still suffers from slow convergence. To further improve the convergence rate of ABC, a novel ABC (CosABC for short) is proposed based on the cosine similarity, which is employed to choose a better neighbor individual. Under the guidance of the chosen neighbor individual, a new solution search equation is introduced to reduce the weakness of the undirected search of ABC. Furthermore, in the employed bees phase, a solution search equation guided by the global best individual is also integrated, and the frequency of parameter perturbation is also employed to further increase information sharing between different individuals. In the onlooker bees phase, ABC/rand/1 is used to enhance the exploitation ability, and an opposition-based learning technique is also used to balance the exploitation of ABC/rand/1. All these modifications together with ABC form the proposed CosABC algorithm. To demonstrate the effectiveness of CosABC, a comprehensive experimental study is conducted on a test suite composed of twenty-four benchmark functions. What is more, CosABC is further compared with a few state-of-the-art algorithms to validate its superiority. The related comparison results show that CosABC is effective and competitive.
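The cosine-similarity step can be pictured as follows: for a given food source, the neighbor used in the search equation is the population member most similar in direction to it, and the new candidate is produced by an ABC-style perturbation toward that neighbor. This is a simplified reading of the mechanism, not the authors' exact CosABC equations, and the bounds and test function are invented.

```python
import numpy as np

def cosine_similarity(a, b):
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

def cosine_guided_candidate(population, i, lower, upper, rng):
    """Generate a new candidate for food source i, guided by the most
    cosine-similar other member of the population (simplified sketch)."""
    x = population[i]
    sims = [cosine_similarity(x, population[j]) if j != i else -np.inf
            for j in range(len(population))]
    k = int(np.argmax(sims))                       # most similar neighbor
    j = rng.integers(x.size)                       # dimension to perturb
    phi = rng.uniform(-1.0, 1.0)
    candidate = x.copy()
    candidate[j] = x[j] + phi * (x[j] - population[k][j])
    return np.clip(candidate, lower, upper)

rng = np.random.default_rng(1)
pop = rng.uniform(-5, 5, size=(10, 4))             # 10 food sources, 4 dimensions
sphere = lambda v: float(np.sum(v ** 2))
new = cosine_guided_candidate(pop, 0, -5, 5, rng)
if sphere(new) < sphere(pop[0]):                   # greedy selection, as in ABC
    pop[0] = new
```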
January 2018
A multi-echelon multi-product stochastic model to supply chain of small-and-medium enterprises in industrial clusters
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Vahid Kayvanfar, S.M. Moattar Husseini, Mohsen S. Sajadieh, B. Karimi An Industrial Cluster (IC) is a set of similar and interrelated firms in a specific field situated in a geographic concentration to share joint resources. To date, the interrelations between ICs and Supply Chain Management (SCM) have not been properly studied mathematically, despite their inherent relationship. In this paper, a Supply-Demand Hub in Industrial Clusters (SDHIC), a specific common provider of warehousing and logistics activities managed by a third-party logistics provider (3PL), is proposed to minimize the total cost of the considered supply chain. The activities of businesses in the IC are modeled through a two-stage stochastic programming model, followed by acceleration techniques for Benders decomposition. Numerical experiments comprising sensitivity analysis are conducted on a case study to show the attractiveness of the proposed model. Some managerial insights are presented based on the obtained results.
January 2018
A knowledge-based measure of product complexity
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Xiaoqi Zhang, Vince Thomson This paper introduces a measure of product complexity from a knowledge perspective. A product is considered to be the result of integrating functions; so, the measure considers the complexity of individual functions as well as integration tasks. Disciplinary knowledge required in product design is classified and quantified. The complexity of each individual function is a measure of the intensity of knowledge requirements. The complexity of integrating two functions is a measure of the product of knowledge difference and interdependency. The application of the new method for the estimation of design effort and project duration is illustrated with an example of a hydroelectric generator.
January 2018
Efficient heuristics for the hybrid flow shop scheduling problem with missing operations
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Manuel Dios, Victor Fernandez-Viagas, Jose M. Framinan In this paper, we address the hybrid flowshop scheduling problem for makespan minimisation. More specifically, we are interested in the special case where there are missing operations, i.e. some stages are skipped, a condition inspired by a realistic problem found at a plastics manufacturer. The main contribution of our paper is twofold. On the one hand, we carry out a computational analysis to study the hardness of the hybrid flowshop scheduling problem with missing operations as compared to the classical hybrid flowshop problem. On the other hand, we propose a set of heuristics that captures some special features of the missing operations and compare these algorithms with already existing heuristics for the classical hybrid flowshop, and for the hybrid flowshop problem with missing operations. The extensive computational experience carried out shows that our proposal outperforms existing methods for the problem, indicating that it is possible to improve the makespan by interacting with the jobs with missing operations.
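To make the setting concrete, the sketch below evaluates the makespan of a given job sequence in a hybrid flow shop where a zero processing time denotes a missing operation and each operation is dispatched to the earliest-available machine of its stage; this simple list-scheduling decoder and the toy instance are illustrative, not the authors' heuristics.

```python
def hfs_makespan(sequence, proc, machines_per_stage):
    """Makespan of `sequence` in a hybrid flow shop.

    proc[j][s]            : processing time of job j at stage s (0 = missing operation)
    machines_per_stage[s] : number of identical parallel machines at stage s"""
    n_stages = len(machines_per_stage)
    machine_free = [[0.0] * m for m in machines_per_stage]   # release time per machine
    makespan = 0.0
    for job in sequence:
        ready = 0.0                                           # job completion time so far
        for s in range(n_stages):
            p = proc[job][s]
            if p == 0:                                        # missing operation: skip stage
                continue
            m = min(range(machines_per_stage[s]), key=lambda i: machine_free[s][i])
            start = max(ready, machine_free[s][m])
            ready = start + p
            machine_free[s][m] = ready
        makespan = max(makespan, ready)
    return makespan

# Toy instance: 4 jobs, 3 stages, some operations missing (0 entries).
proc = [[3, 0, 2], [2, 4, 0], [0, 3, 3], [4, 2, 1]]
print(hfs_makespan([0, 1, 2, 3], proc, machines_per_stage=[2, 1, 2]))
```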
January 2018
An evaluation of the bid price and nested network revenue management allocation methods
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Victor Pimentel, Aishajiang Aizezikali, Tim Baker We compare the revenue generating capabilities of the bid price allocation method and the nested network method in hotel revenue management. Revenue maximization is achieved by an optimal allocation of assets across market segments, subject to constraints such as overbooking limit and the cross-elasticity of competitors’ pricing. Using a simulation model of a large hotel’s reservation system, validated by Marriott hotels, we find that the nested network method outperforms the bid price method, and, on average, leads to an improvement of 6% in revenue in the worst-case scenarios across operating environments. This improvement is 3.6% when restricted to cases in which overbooking and allocation are performed simultaneously. In no operating environment is the improvement less than 2%. Since the bid price method is, by far, the most commonly used allocation method in practice, these results indicate that hotels should consider switching to the nested network method. This change is feasible because (1) most hotels already have in place the core optimization system required to execute the nested network method, and (2) the nested network method converges to optimality in less than two minutes for most realistically sized problems, as we demonstrate.
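For context, the bid-price control that serves as the benchmark accepts a booking request only if its revenue covers the sum of the bid prices (marginal capacity values) of the nights it consumes; a minimal version of that rule, with made-up bid prices and capacities, is shown below.

```python
def accept_request(rate, nights, bid_price, remaining):
    """Bid-price control: accept a multi-night hotel request only if the quoted
    rate covers the sum of the nightly bid prices and capacity remains."""
    if any(remaining[n] <= 0 for n in nights):
        return False
    return rate >= sum(bid_price[n] for n in nights)

# Toy data: bid prices and remaining rooms per night of a 3-night horizon.
bid_price = {0: 80.0, 1: 120.0, 2: 95.0}
remaining = {0: 5, 1: 1, 2: 3}
print(accept_request(rate=310.0, nights=[0, 1, 2], bid_price=bid_price, remaining=remaining))  # True
print(accept_request(rate=180.0, nights=[0, 1], bid_price=bid_price, remaining=remaining))     # False
```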
January 2018
A decision support algorithm for assessing the engagement of a demand response program in the industrial sector of the smart grid
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Omid Ameri Sianaki, Mohammad A.S. Masoum, Vidyasagar Potdar In the industrial sector of the smart grid (SG), a demand response program (DRP) is offered to consumers to motivate them to shift their demand for electricity to the off-peak period. A DRP can cause a dilemma for industrial consumers because decreasing the energy load may disrupt the production process, and they may consequently incur losses. Hence, industrial units may choose to accept or reject a DRP. If they choose to engage in a DRP, they may use the available back-up on-site energy resources to access the required amount of energy. Hence, any decision about load curtailment requires a comprehensive assessment of all layers of production and operational management. This paper utilises several methodologies to evaluate the effects of DRP engagement on operational management. Firstly, the Delphi method is employed for extracting and identifying twenty-six criteria embedded in ten operational and production management factors. Secondly, based on these criteria, the production equipment is ranked using the TOPSIS method. This ranking shows which equipment will have less impact on the organisation’s profit as a result of participating in a DRP; however, it does not support production and energy planning, which are affected by DRP engagement. So, thirdly, a linear programming (LP) model in a discrete scheduling time horizon is proposed which considers the TOPSIS method output and all the constraints imposed by the DRP and the production resources. Finally, based on the proposed methodology, a decision-making algorithm is designed to assist the operation and energy managers to decide whether to accept or reject the offer to engage in a DRP and, if they decide to participate, how to best utilize the available distributed energy resources to regain the energy lost. The main contribution of this paper is the proposed methodology which combines the outcomes of the Delphi and TOPSIS methods with a linear optimisation model, the effectiveness of which is clearly demonstrated by the sensitivity analysis.
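The TOPSIS ranking step follows the standard procedure (vector normalisation, weighting, distance to the ideal and anti-ideal solutions); a generic implementation is given below, with invented criterion weights and scores standing in for the twenty-six Delphi criteria.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Standard TOPSIS ranking.

    matrix  : (alternatives x criteria) performance scores
    weights : criterion weights summing to 1
    benefit : True for benefit criteria, False for cost criteria
    Returns closeness coefficients (higher = better)."""
    X = np.asarray(matrix, dtype=float)
    norm = X / np.linalg.norm(X, axis=0)                 # vector normalisation
    V = norm * np.asarray(weights)
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_plus  = np.linalg.norm(V - ideal, axis=1)
    d_minus = np.linalg.norm(V - anti, axis=1)
    return d_minus / (d_plus + d_minus)

# Toy example: 3 pieces of equipment scored on profit impact (benefit) and
# curtailment disruption (cost), with hypothetical weights.
scores = [[0.8, 0.3], [0.5, 0.1], [0.9, 0.7]]
closeness = topsis(scores, weights=[0.6, 0.4], benefit=[True, False])
print(closeness.argsort()[::-1])      # ranking, best equipment first
```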

January 2018
A reliable multi-period intermodal freight network expansion problem
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Fateme Fotuhi, Nathan Huynh This paper addresses the intermodal freight network expansion problem consisting of multiple periods. In each period, the objective is to determine the locations of new intermodal terminals, the amount of capacity to add to existing terminals, and the existing rail links to retrofit. The multi-period planning problem has the added complexity of determining which period a particular improvement should be made given a limited budget for each time period. A probabilistic robust mathematical model is proposed to address these decisions and uncertainties in the network. Due to the complexity of this model, a hybrid Simulated Annealing (SA) algorithm is proposed to solve the problem and its applicability is demonstrated via two numerical examples. Important managerial insights are drawn and discussed on the benefits of utilizing the multi-period approach.
January 2018
Jointly rostering, routing, and rerostering for home health care services: A harmony search approach with genetic, saturation, inheritance, and immigrant schemes
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Chun-Cheng Lin, Lun-Ping Hung, Wan-Yu Liu, Ming-Chun Tsai In home health care (HHC) services, nurses or professional caregivers are dispatched to patients’ homes to provide medical care services, such that each patient can stay at home to be treated periodically. The HHC problem consists of the nurse rostering problem (NRP) and the vehicle routing problem with time windows (VRPTW), both of which are NP-hard problems, which are harder than or equal to the hardest problems in the NP (nondeterministic polynomial time) class and generally cannot be solved efficiently. To the best of our knowledge, previous algorithmic approaches were designed to separately address the NRP and the VRPTW of the HHC problem. However, the NRP and the VRPTW of the HHC problem are intercorrelated, and their respective optimal objectives may be in conflict with each other in many cases. Additionally, the problem generally involves too many constraints to be solved, and most previous works did not address the occurrence of sudden incidents in HHC services (e.g., a nurse or a patient suddenly requests a leave, or a patient suddenly changes the time slot to be treated), after which the original nurse roster could become infeasible. Hence, this work proposes two models. Under constraints of nurse qualifications, working laws, nurse preferences, and vehicle routing, the first model considers nurse rostering and vehicle routing concurrently to minimize the total costs of nurse overtime and vehicle routing. The second model extends the first model with rerostering caused by the occurrence of sudden incidents. The harmony search algorithm (HSA) has been shown to perform better in solving NRPs than conventional metaheuristics, and hence this work proposes an improved HSA with genetic and saturation schemes, in which the solution representation is designed to concurrently determine nurse rostering and vehicle routing. For the second rerostering model, inheritance and immigrant schemes are added to the HSA to adapt to the change caused by the occurrence of sudden incidents. Experimental results show that the proposed HSA performs well and can adapt to the change caused by sudden incidents.
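As background, the core harmony search improvisation step that the proposed HSA extends (memory consideration, pitch adjustment, random selection) can be written compactly as follows; the continuous objective, bounds and parameters are placeholders, and the genetic, saturation, inheritance and immigrant schemes described in the paper are not reproduced here.

```python
import random

def harmony_search(objective, dim, lower, upper,
                   hms=20, hmcr=0.9, par=0.3, bandwidth=0.05, iterations=5000):
    """Minimal harmony search for a continuous objective (sketch)."""
    memory = [[random.uniform(lower, upper) for _ in range(dim)] for _ in range(hms)]
    scores = [objective(h) for h in memory]
    for _ in range(iterations):
        new = []
        for d in range(dim):
            if random.random() < hmcr:                        # memory consideration
                value = random.choice(memory)[d]
                if random.random() < par:                     # pitch adjustment
                    value += random.uniform(-1, 1) * bandwidth * (upper - lower)
            else:                                             # random selection
                value = random.uniform(lower, upper)
            new.append(min(max(value, lower), upper))
        new_score = objective(new)
        worst = max(range(hms), key=lambda i: scores[i])
        if new_score < scores[worst]:                         # replace worst harmony
            memory[worst], scores[worst] = new, new_score
    best = min(range(hms), key=lambda i: scores[i])
    return memory[best], scores[best]

sphere = lambda x: sum(v * v for v in x)
print(harmony_search(sphere, dim=5, lower=-10, upper=10))
```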
January 2018
A fuzzy modeling approach to optimize control and decision making in conflict management in air traffic control
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Agnaldo Volpe Lovato, Cristiano Hora Fontes, Marcelo Embiru
January 2018
Unmanned aerial vehicle routing in the presence of threats
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Kamil A. Alotaibi, Jay M. Rosenberger, Stephen P. Mattingly, Raghavendra K. Punugu, Siriwat Visoldilokpun We study the routing of Unmanned Aerial Vehicles (UAVs) in the presence of the risk of enemy threats. The main goal is to find optimal routes that consider targets visited, threat exposure, and travel time. We formulate a mixed integer linear program that maximizes the total number of visited targets for multiple UAVs, while limiting both the route travel time for each UAV and the total threat exposure level for all UAVs to predetermined constant parameters. The formulation considers a set covering vehicle routing problem where the risk of threat exposure and the travel time are modeled for each edge in a vehicle routing network. To reduce threat exposure, waypoints are generated within the network so routes can avoid high-risk edges. We propose several waypoint generation methods. Using the candidate waypoints, the UAV routes are optimized with branch-and-cut-and-price (BCP) methodology. Minimum dependent set constraints and a simple path heuristic are used to improve the computational efficiency of the BCP algorithm. Computational results are presented, which show that the BCP algorithm performs best when the number of waypoints generated a priori is about half the number of targets.
January 2018
Reconfigurable machining process planning for part variety in new manufacturing paradigms: Definitions, models and framework
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Qing Xia, Alain Etienne, Jean-yves Dantan, Ali Siadat Conventional machining process planning approaches are inefficient to handle the process planning complexity induced by part variety. Reconfigurable process planning is a new process planning approach which has been well recognized as a key enabler for current manufacturing paradigms. However, in the literature, there is neither a comprehensive part variety representation model to support reconfigurable process planning nor a global solution framework to instruct the generation of the feasible process plans for a specific part variant. Therefore, this paper extends the concept of reconfigurable process planning to a concept of reconfigurable machining process planning which targets the process plan generation for a part family. A solution framework is developed for reconfigurable machining process planning. In this framework, a feature-based part variety model is proposed to represent a part family; A reconfigurable machining process plan is defined as a set of modular components which can be configured/reconfigured into the machining process plans for any part variant in the family; a novel configuration approach is proposed to generate the process plan components for a specific part variant while configuring this part variant from the family. The feasibility and effectiveness of the proposed framework and models are tested in a real case study.
January 2018
Minimizing the maximum lateness on a single machine with raw material constraints by branch-and-cut
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): P
January 2018
Who should determine energy efficiency level in a green cost-sharing supply chain with learning effect?
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Qiao Zhang, Wansheng Tang, Jianxiong Zhang Increasing environmental awareness and green demands drive channel members to jointly take efforts to improve the energy efficiency level of products. Considering the cost learning effect, this work develops a differential game model where the retailer (leader) sets the retail margin, the manufacturer (follower) determines the wholesale price, and they jointly invest in the energy efficiency level. Two scenarios are considered, in which the decision right over the energy efficiency level is held by the manufacturer (ME) and by the retailer (RE), respectively. The main results indicate that the energy efficiency level is usually governed by the manufacturer rather than the retailer, unless the retailer has a large bargaining power. Besides, the retailer’s preference for the decision right is weakened by a larger cost learning effect or energy efficiency effectiveness, but strengthened by a greater investment cost. This work contributes to research on channel members’ preference for holding the decision right over the energy efficiency level in the presence of the cost learning effect, and provides important managerial insights on firms’ investment and pricing strategies.
January 2018
Picking efficiency and stock safety: A bi-objective storage assignment policy for temperature-sensitive products
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): R. Accorsi, G. Baruffaldi, R. Manzini The increasing attention of consumers to product quality and safety raises new challenges for logistics. Enhancing operational efficiency while ensuring the safety and quality of the handled products are key levers for logistics providers and other operators in temperature-sensitive supply chains. Managing the temperature conditions experienced by the inventory is valuable in warehouses and any other points along the supply chain where products pause for long periods. This paper proposes an original adaptive storage assignment policy for temperature-sensitive products, which enables efficiency and stock-safety goals to be managed jointly. This policy is based on a bi-objective integer programming model and an original solving algorithm. We intend our policy for warehouses that handle temperature-sensitive products in the presence of high demand and weather seasonality and strong inventory-mix turnover. To the best of the authors’ knowledge, this is the first attempt to integrate into a storage assignment policy the issue of stock quality conservation, the optimization of the picking activities, and the management of weather and demand seasonality at the warehouse. A multi-scenario what-if analysis was applied to a 3PL warehouse of biomedical products to validate the policy and explore its insights in a real-world application. This policy autonomously balances the management of the inventory between the efficiency and stock-safety levers and records savings of 12% in picking travel time and up to 20% in inventory safety. In conclusion, this policy assesses how the warehouse infrastructure can respond to demand and weather seasonality in accordance with the efficiency and safety requirements.

January 2018
A progressive approach to joint monitoring of process parameters
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Raja Fawad Zafar, Tahir Mahmood, Nasir Abbas, Muhammad Riaz, Zawar Hussain Process monitoring is a continuous phenomenon and it needs careful attention for an improved quality of output. Location and dispersion parameters play a vital role in regulating every process, and a timely detection of any change in their stable behavior is required. Nowadays, practitioners prefer a single charting setup that offers better ability to detect joint shifts in the process parameters. In this study, we propose a new parametric memory-type charting structure based on the progressive mean under a max statistic, namely the Max-P chart, for the joint monitoring of location and dispersion parameters. Assuming normality of the quality characteristic of interest, this study provides an extensive comparison between the proposed chart and some existing schemes for joint monitoring of process location and dispersion parameters. We use run length properties for the performance analysis of the different schemes under investigation in this study. These properties include individual and overall measures (average run length, standard deviation of run length, extra quadratic loss, relative average run length, and performance comparison index) for comparative analysis. The study findings reveal that the newly proposed Max-P monitoring scheme offers relatively better performance in detecting shifts in the process parameter(s). A real-life application from electrical engineering is also included, where the monitoring of the voltage of a photovoltaic system is desired. The proposed scheme also offers better detection ability to identify special causes in the parameters of the electrical process.
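One plausible reading of the Max-P statistic is sketched below: each subgroup is reduced to standardised location and dispersion statistics, their maximum absolute value is taken (the "max" part), and the progressive (cumulative) mean of that statistic is then tracked. The normalising transformation, the synthetic data and the absence of a control limit here are illustrative simplifications, not the paper's design.

```python
import numpy as np
from scipy import stats

def max_p_statistics(subgroups, mu0, sigma0):
    """Progressive mean of a max-type statistic for joint monitoring (sketch).

    subgroups   : list of 1-D arrays (samples of equal size n)
    mu0, sigma0 : in-control mean and standard deviation"""
    n = len(subgroups[0])
    max_stats = []
    for sample in subgroups:
        u = np.sqrt(n) * (np.mean(sample) - mu0) / sigma0            # location
        chi2_stat = (n - 1) * np.var(sample, ddof=1) / sigma0 ** 2
        v = stats.norm.ppf(stats.chi2.cdf(chi2_stat, df=n - 1))      # dispersion, normalised
        max_stats.append(max(abs(u), abs(v)))
    return np.cumsum(max_stats) / np.arange(1, len(max_stats) + 1)   # progressive mean

rng = np.random.default_rng(2)
in_control = [rng.normal(0, 1, 5) for _ in range(20)]
shifted    = [rng.normal(1.0, 1, 5) for _ in range(10)]              # mean shift
pm = max_p_statistics(in_control + shifted, mu0=0.0, sigma0=1.0)
print(pm[-5:])            # the progressive mean rises after the shift
```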
January 2018
A powerful discriminative approach for selecting the most efficient unit in DEA
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Mehdi Toloo, Maziar Salahi Data envelopment analysis (DEA) is a mathematical approach that deals with the performance evaluation problem. Traditional DEA models partition the set of units into two distinct sets: efficient and inefficient. These models fail to provide further information about efficient units, whereas there are some applications, known as selection-based problems, where the concern is selecting only a single efficient unit. To address the problem, several mixed integer linear/nonlinear programming models have been developed in the literature using DEA. The aim of all these approaches is to formulate a model with more discriminating power. This paper presents a new nonlinear mixed integer programming model with significantly higher discriminating power than the existing ones in the literature. The suggested model lets the efficiency score of only a single unit be strictly greater than one. It is observed that the discrimination power of the model is high enough for fully ranking all units. More importantly, a linearization technique is used to formulate an equivalent mixed integer linear programming model, which significantly decreases the computational burden. Finally, to validate the proposed model and also compare it with some recent approaches, two numerical examples are utilized from the literature. Our findings point out the superiority of our model over all the previously suggested models from both theoretical and practical standpoints.
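For readers new to DEA, the baseline CCR efficiency score that such selection models refine can be obtained per unit from the standard input-oriented multiplier LP below (maximise weighted outputs subject to the unit's weighted inputs equalling 1 and no unit exceeding efficiency 1); this is the classical model with toy data, not the paper's mixed-integer formulation.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, unit):
    """Input-oriented CCR efficiency of `unit` (multiplier form).

    inputs  : (n_units x m) matrix of input quantities
    outputs : (n_units x s) matrix of output quantities"""
    X, Y = np.asarray(inputs, float), np.asarray(outputs, float)
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: output weights u (length s) then input weights v (length m).
    c = np.concatenate([-Y[unit], np.zeros(m)])          # maximise u.y0 -> minimise -u.y0
    A_ub = np.hstack([Y, -X])                            # u.Y_j - v.X_j <= 0 for every unit j
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[unit]]).reshape(1, -1)   # v.x0 = 1
    b_eq = [1.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m))
    return -res.fun

# Toy data: 4 units, 2 inputs, 1 output.
X = [[2, 3], [4, 2], [3, 3], [5, 4]]
Y = [[10], [12], [11], [12]]
print([round(ccr_efficiency(X, Y, k), 3) for k in range(4)])
```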
January 2018
A multi-objective optimization model of component selection in enterprise information system integration
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Lifeng Mu, C.K. Kwong Integrating legacy IT assets and new commercial software components into a flexible IT architecture is one of the open challenges facing modern enterprises today. Most previous studies focused on using re-engineering to improve the flexibility of IT architectures, rather than employing optimization theory in the architecture design problem, especially the problem of component selection and re-allocation in an IT architecture. Moreover, a scant amount of literature is available on considering architectural flexibility and integration cost simultaneously. To fill this gap, based on a modified quantitative method of measuring the relationship between couplings and cohesions in an architecture, we devise a nonlinear multi-objective binary integer program to select components from legacy candidates and commercial candidates, and to group them into services under the service-oriented architecture (SOA) environment. The customized SPEA2 algorithm is further used to solve the problem, and some managerial insights are provided based on experiments and sensitivity analysis with the model.
January 2018
Scheduling assemble-to-order systems with multiple cells to minimize costs and tardy deliveries
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Alex J. Ruiz-Torres, Giuseppe Paletta, Farzad Mahmoodi, Jose H. Ablanedo-Rosas Responsive assemble-to-order supply chains demand efficient coordination between production and distribution functions. This paper investigates the case of customer orders assigned to configuration cells located in different geographical regions. Complete orders are directly shipped from the cells to customers located in distinct locations. The objective is to minimize the total cost of production and transportation, and the percentage of tardy deliveries simultaneously. The problem is formulated as a bi-objective optimization problem. Four heuristics are developed for generating a set of Pareto solutions. Extensive experiments ranging from small to large-scale instances are performed. Results show the heuristics generate good, feasible non-dominant solutions, which are especially critical for decision makers. Our results also demonstrate that there is no clear dominant heuristic and that relative heuristic performance is dependent on problem size. Sensitivity analysis of critical parameters reveals additional insights.
January 2018
A novel two-level optimization approach for clustered vehicle routing problem
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Petrica C. Pop, Levente Fuksz, Andrei Horvat Marc, Cosmin Sabo In this paper, we address the clustered vehicle routing problem (CluVRP), which is a variant of the classical capacitated vehicle routing problem (CVRP). The main characteristics of this problem are the following: the vertices of the graph are partitioned into a given number of clusters, and we look for a minimum-cost collection of routes starting and ending at the depot, visiting all the vertices except the depot exactly once, with the additional constraint that once a vehicle enters a cluster it visits all the vertices within the cluster before leaving it. We describe a novel two-level optimization approach for the CluVRP obtained by decomposing the problem into two logical and natural smaller subproblems, an upper-level (global) subproblem and a lower-level (local) subproblem, which are solved separately. The goal of the first subproblem is to determine the (global) routes visiting the clusters using a genetic algorithm, while the goal of the second subproblem is to determine, for the above-mentioned routes, the visiting order within the clusters. The second subproblem is solved by transforming each global route into a traveling salesman problem (TSP), which is then solved to optimality using the Concorde TSP solver. Extensive computational results are reported and discussed for a frequently used set of benchmark instances. The obtained results show an improvement in the quality of the achieved solutions and prove the efficiency of our approach compared to existing methods from the literature.
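A minimal sketch of the lower-level step follows: given fixed entry and exit points of a cluster on a global route, the visiting order inside the cluster is a small TSP. The brute-force search below is only an illustration on a toy cluster; the paper solves these subproblems exactly with the Concorde TSP solver.

```python
from itertools import permutations

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def best_intra_cluster_order(entry, cluster_pts, exit_):
    """Brute-force the visiting order inside one cluster between fixed entry/exit points.
    Only practical for small clusters; serves as an illustration of the local subproblem."""
    best_order, best_len = None, float("inf")
    for perm in permutations(cluster_pts):
        pts = [entry, *perm, exit_]
        length = sum(dist(pts[i], pts[i + 1]) for i in range(len(pts) - 1))
        if length < best_len:
            best_order, best_len = perm, length
    return list(best_order), best_len

# toy cluster of four customers visited between an entry and an exit point
order, length = best_intra_cluster_order((0, 0), [(1, 2), (2, 1), (3, 3), (1, 4)], (5, 0))
print(order, round(length, 2))
```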
January 2018
Big data analytics in supply chain management between 2010 and 2016: Insights to industries
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Sunil Tiwari, H.M. Wee, Yosef Daryanto This paper investigates big data analytics research and application in supply chain management between 2010 and 2016 and provides insights to industries. In recent years, the amount of data produced by end-to-end supply chain management practices has increased exponentially. Moreover, in the current competitive environment, supply chain professionals struggle to handle this huge volume of data. They are exploring new techniques to investigate how data are produced, captured, organized, and analyzed to give valuable insights to industries. Big data analytics is one of the most promising techniques to help them overcome this problem. Realizing the promising benefits of big data analytics in the supply chain has motivated us to write a review on the importance and impact of big data analytics and its application in supply chain management. First, we discuss big data analytics on its own, and then we discuss the role of big data analytics in supply chain management (supply chain analytics). Current research and applications are also explored. Finally, we outline the insights to industries. Observations and insights from this paper can provide a guideline for academia and practitioners in implementing big data analytics in different aspects of supply chain management.
January 2018
Algorithms for the bin packing problem with overlapping items
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Aristide Grange, Imed Kacem, S
January 2018
A proactive approach to solve integrated production scheduling and maintenance planning problem in flow shops
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Weiwei Cui, Zhiqiang Lu, Chen Li, Xiaole Han This paper deals with the integration of production scheduling and maintenance planning in order to optimize the bi-objective of quality robustness and solution robustness for flow shops under failure uncertainty. First, a proactive model is proposed to formulate the problem mathematically. Then, a Monte Carlo sampling method is adopted to obtain the objective value for feasible solutions, and a surrogate measure is proposed to approximate the objective function efficiently. Based on the sampling method and the surrogate measure, a two-loop algorithm is devised to simultaneously optimize the sequence of jobs, the positions of preventive maintenance operations, and the idle times. Computational results indicate that solution robustness and the stability of quality robustness can be significantly improved using our algorithm compared with solutions obtained in the traditional way.
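A minimal sketch of the Monte Carlo evaluation idea follows, on a simplified single-machine stand-in rather than the paper's flow-shop model: for a fixed job sequence with preventive maintenance (PM) positions and a planned idle buffer, random breakdowns are sampled and the expected makespan is estimated. The failure rate, repair time, and PM duration are assumed values.

```python
import random

def simulate_makespan(jobs, pm_positions, idle, fail_rate=0.02, repair=5.0, n_samples=500):
    """Monte Carlo estimate of the expected makespan for a fixed job sequence with PM
    inserted before the given job positions and a planned idle buffer before each job."""
    makespans = []
    for _ in range(n_samples):
        t = 0.0
        for i, p in enumerate(jobs):
            if i in pm_positions:
                t += 3.0                          # assumed PM duration
            t += idle                             # planned idle buffer (solution robustness lever)
            # crude breakdown model: each unit of processing time may trigger a repair
            breakdowns = sum(1 for _ in range(int(p)) if random.random() < fail_rate)
            t += p + breakdowns * repair
        makespans.append(t)
    return sum(makespans) / n_samples

print(round(simulate_makespan(jobs=[6, 4, 9, 3, 7], pm_positions={2}, idle=0.5), 2))
```

In the paper, such sampled evaluations (and a cheaper surrogate of them) drive the outer search over job sequence, PM positions, and idle times.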
January 2018
Developing a CCHP-microgrid operation decision model under uncertainty
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Carlos Marino, Mohammad Marufuzzaman, Mengqi Hu, M.D. Sarder In this study, we present a collaborative decision model to study the energy exchange among building clusters in which the buildings share a combined cooling, heating and power (CCHP) system, thermal storage, and a battery, and each building aims to minimize its energy consumption cost under electricity demand uncertainty. The problem is formulated as a two-stage stochastic programming model and then solved using a hybrid decomposition algorithm that combines the Sample Average Approximation algorithm with an enhanced Benders decomposition algorithm. Numerical experiments reveal that the hybrid decomposition algorithm provides high-quality feasible solutions to the realistic large-scale building cluster problem in a reasonable amount of time. Experimental results allow investors to decide the optimal sizing of thermal and battery storage and PGU capacities under power demand uncertainty. Further, the model can assist decision makers in choosing the appropriate pricing mechanism (i.e., an optimal pricing plan) under different levels of electricity demand variability. Finally, we observe that the CCHP-microgrid system is more sensitive to an increase in heating demand than in cooling demand.
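A minimal sketch of the Sample Average Approximation idea follows: the expectation over demand uncertainty is replaced by an average over sampled scenarios, and a first-stage sizing decision is chosen to minimize installation cost plus average recourse cost. The single-capacity toy model, cost figures, and demand distribution are assumptions; the paper's model sizes CCHP, thermal storage, and battery jointly and is solved with SAA combined with enhanced Benders decomposition.

```python
import random

random.seed(1)

def saa_capacity(scenarios, cap_cost=10.0, short_pen=25.0, grid=range(0, 201)):
    """Pick a storage capacity minimizing installation cost plus the *sample average*
    shortage penalty over the given demand scenarios (SAA of a two-stage model)."""
    def total_cost(cap):
        recourse = sum(short_pen * max(0.0, d - cap) for d in scenarios) / len(scenarios)
        return cap_cost * cap + recourse
    return min(grid, key=total_cost)

demand_samples = [random.gauss(120, 25) for _ in range(1000)]   # hypothetical kWh demands
print("chosen capacity:", saa_capacity(demand_samples))
```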
January 2018
Development of intuitionistic fuzzy super-efficiency slack based measure with an application to health sector
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Alka Arya, Shiv Prasad Yadav Data envelopment analysis (DEA) is a linear programming based technique that determines the performance efficiencies of homogeneous decision making units (DMUs). The slack based measure (SBM) model finds the performance efficiency and deals with the input excesses and output shortfalls of DMUs. In the conventional SBM model, the data are crisp, but in real-world applications the data fluctuate. Such data can take the form of fuzzy/intuitionistic fuzzy (IF) numbers. In this paper, we propose an IF slack based measure (IFSBM) model to determine the efficiency of DMUs and an IF super efficiency SBM (IFSESBM) model to determine the efficiency of efficient DMUs for
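For reference, these intuitionistic fuzzy models extend the classical crisp SBM model; in the crisp case the efficiency of DMU o with m inputs x and s outputs y is given by the standard formulation below (not the paper's IF extension):

```latex
\rho_o \;=\; \min_{\lambda,\; s^-,\; s^+}\;
\frac{1 - \tfrac{1}{m}\sum_{i=1}^{m} s_i^{-}/x_{io}}
     {1 + \tfrac{1}{s}\sum_{r=1}^{s} s_r^{+}/y_{ro}}
\quad \text{s.t.} \quad
x_{io} = \sum_{j=1}^{n} \lambda_j x_{ij} + s_i^{-},\qquad
y_{ro} = \sum_{j=1}^{n} \lambda_j y_{rj} - s_r^{+},\qquad
\lambda,\; s^{-},\; s^{+} \ge 0,
```

where the slacks s^- and s^+ capture the input excesses and output shortfalls mentioned above.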
January 2018
Optimal scheduling of manufacturing and onsite generation systems in over-generation mitigation oriented electricity demand response program
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Md Monirul Islam, Xiao Zhong, Haoyi Xiong, Zeyi Sun Manufacturing systems are considered a valuable resource that can provide electricity load adjustment in demand response programs to balance the supply and demand of electricity throughout the grid. In this paper, we propose a mathematical model to identify the optimal participation strategy for manufacturing end-use customers with an onsite energy generation system in a demand response program designed to mitigate electricity over-generation due to the high penetration of renewable sources in the electricity grid. The background of the over-generation mitigation oriented demand response program is described first. Then, the manufacturer's decision making procedure for identifying the optimal participation strategy is modeled as a mixed integer nonlinear program. In particular, the manufacturer's participation strategies, including the decision to participate or not, the corresponding production schedule of the manufacturing system, and the utilization schedule of the onsite generation system, are modeled as decision variables in an objective function that minimizes the overall cost, considering the benefits of participation, the energy billing cost, the onsite generation cost, and the production loss penalty cost. Particle swarm optimization is used to find a near-optimal solution for the formulated problem. A numerical case study with sensitivity analysis is then conducted to demonstrate the effectiveness and robustness of the proposed model.
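As a sketch of the search machinery, a minimal continuous particle swarm optimizer is shown below with a toy 24-slot cost function. The inertia and acceleration coefficients are common textbook defaults; the paper's actual application would additionally handle the binary participation decisions and the full cost terms of its mixed integer nonlinear model.

```python
import random

def pso(cost, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, lo=0.0, hi=1.0):
    """Minimal particle swarm optimizer minimizing `cost` over [lo, hi]^dim."""
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))   # keep within bounds
            val = cost(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# toy objective: a hypothetical 24-slot production level profile with a quadratic "energy cost"
best, val = pso(lambda x: sum((xi - 0.3) ** 2 for xi in x), dim=24)
print(round(val, 4))
```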
January 2018
Evolutionary algorithms for solving the airline crew pairing problem
Publication date: January 2018
Source:Computers & Industrial Engineering, Volume 115 Author(s): Muhammet Deveci, Nihan