International Science Index
International Journal of Computer and Systems Engineering
Two-Stage Approach for Solving the Multi-Objective Optimization Problem on Combinatorial Configurations
The statement of the multi-objective optimization problem on combinatorial configurations is formulated, and an approach to its solution is proposed. The problem is of interest as a combinatorial optimization problem with many criteria and serves as a model for many applied tasks. The proposed approach consists of two stages: the first reduces the multi-objective problem to a single-criterion one using existing multi-objective optimization methods; the second solves the resulting single-criterion combinatorial optimization problem by the horizontal combinatorial method. This approach provides the optimal solution to the multi-objective optimization problem on combinatorial configurations, taking additional restrictions into account, in a finite number of steps.
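As an illustration of the two-stage idea, the following sketch uses a weighted-sum scalarization (one common way to collapse several criteria into a single one) followed by exhaustive search over a small combinatorial configuration (permutations); the objective functions, weights, and problem size are hypothetical and not the paper's horizontal combinatorial method.

```python
# Illustrative two-stage sketch: scalarize, then optimize over permutations.
from itertools import permutations

def scalarize(objectives, weights):
    """Stage 1: collapse several criteria into one weighted-sum criterion."""
    return lambda x: sum(w * f(x) for w, f in zip(weights, objectives))

def solve_on_permutations(items, objectives, weights, feasible=lambda x: True):
    """Stage 2: optimize the single criterion over all feasible permutations."""
    score = scalarize(objectives, weights)
    candidates = (p for p in permutations(items) if feasible(p))
    return min(candidates, key=score)

# Hypothetical example: two conflicting criteria on a permutation of 5 items.
f1 = lambda p: sum(i * v for i, v in enumerate(p))        # weighted position cost
f2 = lambda p: sum(abs(a - b) for a, b in zip(p, p[1:]))  # adjacency smoothness
best = solve_on_permutations(range(5), [f1, f2], weights=[0.7, 0.3])
print(best)
```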
Reinforcement Learning-Based Coexistence Interference Management in Wireless Body Area Networks
Current trends in remote health monitoring, driven by Internet of Things applications, call for efficient and interference-free communication in the Wireless Body Area Network (WBAN) scenario. Co-existence interference among WBANs aggravates the over-congested radio bands, thereby requiring efficient Carrier Sense Multiple Access with Collision Avoidance (CSMA/CA) strategies and improved interference management. Existing solutions rely on simplistic heuristics to approach interference problems. The scope of this research article is to investigate reinforcement learning for efficient interference management under co-existing scenarios, with an emphasis on homogeneous interference. The aim of this paper is to propose a smart CSMA/CA mechanism based on reinforcement learning, called QIM-MAC, that effectively uses sensing slots with minimal interference. Simulation results, analyzed across several scenarios, show that the proposed approach maximizes Average Network Throughput and Packet Delivery Ratio and minimizes Packet Loss Ratio, Energy Consumption and Average Delay.
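A minimal sketch of the kind of learning loop such a scheme relies on is shown below: a Q-learning agent choosing a sensing slot and being penalized when the transmission collides. The slot count, reward model, and learning parameters are illustrative assumptions, not the QIM-MAC specification.

```python
# Toy Q-learning agent selecting a sensing slot under interference.
import random

N_SLOTS, ALPHA, GAMMA, EPSILON = 8, 0.1, 0.9, 0.1
q = [0.0] * N_SLOTS  # one state, N_SLOTS actions

def choose_slot():
    if random.random() < EPSILON:                       # explore
        return random.randrange(N_SLOTS)
    return max(range(N_SLOTS), key=q.__getitem__)       # exploit

def update(slot, collided):
    reward = -1.0 if collided else 1.0                  # penalize interfered transmissions
    q[slot] += ALPHA * (reward + GAMMA * max(q) - q[slot])

# Toy environment: slots 0-2 suffer heavy co-existence interference.
for _ in range(5000):
    s = choose_slot()
    update(s, collided=(s < 3 and random.random() < 0.8) or random.random() < 0.1)
print("preferred slot:", max(range(N_SLOTS), key=q.__getitem__))
```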
Rank-Based Chain-Mode Ensemble for Binary Classification
In the field of machine learning, ensembling has been employed as a common methodology to improve performance over multiple base classifiers. However, true predictions are often canceled out by false ones during consensus due to a phenomenon called the "curse of correlation", which manifests as strong interference among the predictions produced by the base classifiers. In addition, existing practices are still not able to effectively mitigate the problem of imbalanced classification. Based on the analysis of our experimental results, we conclude that both problems are caused by inherent deficiencies in the consensus approach. Therefore, we create an enhanced ensemble algorithm that adopts a designed rank-based chain-mode consensus to overcome the two problems. To evaluate the proposed ensemble algorithm, we employ the well-known benchmark data set NSL-KDD (the improved version of the KDDCup99 dataset produced by the University of New Brunswick) to compare the proposed algorithm with 8 common ensemble algorithms. In particular, each compared ensemble classifier uses the same 22 base classifiers, so that the differences in the improvements to accuracy and reliability over the base classifiers can be truly revealed. As a result, the proposed rank-based chain-mode consensus proves to be a more effective ensemble solution than the traditional consensus approach, outperforming the 8 ensemble algorithms by 20% on almost all compared metrics, which include accuracy, precision, recall, F1-score and area under the receiver operating characteristic curve.
Dependability Tools in Multi-Agent Support for Failures Analysis of Computer Networks
During their activity, all systems must remain operational without failures; in this context, the dependability concept is essential to avoid disruption of their function. As computer networks are systems with the same dependability requirements, this article deals with a failure analysis of a computer network. The proposed approach integrates specific tools of the KB3 platform, usually applied in dependability studies of industrial systems. The methodology is supported by a multi-agent system formed by six agents grouped into three meta-agents operating on two levels. The first level concerns the modeling step, through a conceptual agent and a generating agent. The conceptual agent is dedicated to building the knowledge base from the system specifications written in the FIGARO language. The generating agent automatically produces both the structural model and a dependability model of the system. The second level, simulation, shows the effects of system failures through a simulation agent. The approach is validated by applying it to a specific computer network, giving an analysis of failures through their effects on the considered network.
Segmentation of Arabic Handwritten Numeral Strings Based on Watershed Approach
Arabic offline handwriting recognition is considered one of the most challenging topics. Arabic handwritten numeral strings are used to automate systems that deal with numbers, such as postal codes, bank account numbers and numbers on car plates. Segmentation of connected numerals is the main bottleneck in handwritten numeral recognition systems, and solving it can in turn increase the speed and efficiency of the recognition system. In this paper, we propose algorithms for the automatic segmentation and feature extraction of Arabic handwritten numeral strings based on the watershed approach. The algorithms have been designed and implemented to segment and extract strings of numeral digits written by hand, especially in the courtesy amount of bank checks. The segmentation algorithm partitions the string into multiple regions that can be associated with the properties of one or more criteria. The numeral extraction algorithm then separates the string into individual digits. Both algorithms have been tested successfully and efficiently for all types of numerals.
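The sketch below illustrates the general watershed segmentation approach on a binarized numeral string using scikit-image; it shows the standard distance-transform/marker pipeline rather than the paper's exact algorithm, and `binary` is assumed to be a 2-D boolean image with ink pixels set to True.

```python
# Watershed segmentation of a binarized numeral string (illustrative).
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def segment_digits(binary):
    # Distance to the background: digit cores become local maxima.
    distance = ndi.distance_transform_edt(binary)
    # One marker per presumed digit core.
    coords = peak_local_max(distance, min_distance=10, labels=binary.astype(int))
    markers = np.zeros(binary.shape, dtype=int)
    markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
    # Flood the inverted distance map; each basin is one candidate digit.
    labels = watershed(-distance, markers, mask=binary)
    # Return each labeled region as a separate digit image.
    return [binary & (labels == k) for k in range(1, labels.max() + 1)]
```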
Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm
Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between explanatory variables and predicted variables: past occurrences are exploited to predict an unknown outcome. With the advent of big data, many studies have suggested the use of predictive analytics to process and analyze big data. Nevertheless, they have been curbed by the limits of classical predictive analysis methods when faced with large amounts of data. In fact, because of its volume, nature (semi-structured or unstructured) and variety, big data cannot be analyzed efficiently with classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow the parallelization and distribution of computation. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to this algorithm are presented, and a version of the extended algorithm is then defined to make it applicable to huge quantities of data.
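As a rough illustration of how a CART split search can be parallelized, the sketch below computes local class counts per data partition and merges only the small count tables before evaluating the Gini impurity; it shows the general map/aggregate idea, not the authors' specific extension.

```python
# Distributed-style CART split evaluation: per-partition counts, then aggregate.
from collections import Counter

def local_counts(rows, feature, threshold):
    """rows: iterable of (features_dict, label) pairs in one data partition."""
    left, right = Counter(), Counter()
    for x, y in rows:
        (left if x[feature] <= threshold else right)[y] += 1
    return left, right

def gini(counts):
    n = sum(counts.values())
    return 1.0 - sum((c / n) ** 2 for c in counts.values()) if n else 0.0

def split_impurity(partitions, feature, threshold):
    # In a distributed setting each partition's counts would be computed on a
    # separate worker node; only the small count tables travel to the driver.
    partials = [local_counts(p, feature, threshold) for p in partitions]
    left = sum((l for l, _ in partials), Counter())
    right = sum((r for _, r in partials), Counter())
    n = sum(left.values()) + sum(right.values())
    return (sum(left.values()) * gini(left) + sum(right.values()) * gini(right)) / n

# Two hypothetical partitions of a tiny dataset.
partitions = [
    [({"age": 25}, "yes"), ({"age": 40}, "no")],
    [({"age": 31}, "yes"), ({"age": 55}, "no"), ({"age": 22}, "yes")],
]
print(split_impurity(partitions, "age", 30))
```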
Measuring Banks’ Antifragility via Fuzzy Logic
Analysing the world banking sector, we realize that traditional risk measurement methodologies no longer reflect the actual scenario, with its uncertainty, and leave out events that can change the dynamics of markets. Considering this, regulators and financial institutions began to search for more realistic models. The aim is to include external influences and interdependencies between agents in order to describe and measure the operation of these complex systems and their risks in a more coherent and credible way. Within this context, X-Events are more frequent than assumed and, with uncertainties and constant change, the concept of antifragility starts to gain great prominence in comparison to other risk management methodologies. It is very useful to analyse whether a system succumbs to (fragile), resists (robust) or benefits from (antifragile) disorder and stress. Thus, this work proposes the creation of the Banking Antifragility Index (BAI), which is based on the calculation of a triangular fuzzy number to "quantify" qualitative criteria linked to antifragility.
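The sketch below shows one simple way qualitative ratings can be aggregated as triangular fuzzy numbers and defuzzified into a single crisp index; the criteria, scale, weights and centroid defuzzification are illustrative assumptions, not the BAI's published definition.

```python
# Triangular fuzzy aggregation and centroid defuzzification (illustrative).
def tfn_weighted_sum(ratings, weights):
    """Each rating is a triangular fuzzy number (low, mode, high)."""
    lo = sum(w * r[0] for w, r in zip(weights, ratings))
    md = sum(w * r[1] for w, r in zip(weights, ratings))
    hi = sum(w * r[2] for w, r in zip(weights, ratings))
    return lo, md, hi

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number."""
    return sum(tfn) / 3.0

# Hypothetical bank: three antifragility criteria rated on a 0-10 fuzzy scale.
ratings = [(4, 5, 7), (6, 7, 9), (2, 3, 5)]
weights = [0.5, 0.3, 0.2]
bai = defuzzify(tfn_weighted_sum(ratings, weights))
print(f"Banking Antifragility Index (illustrative): {bai:.2f}")
```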
Identifying Missing Component in the Bechdel Test Using Principal Component Analysis Method
A lot has been said and discussed regarding the rationale and significance of the Bechdel score. It became a digital sensation in 2013, when Swedish cinemas began to showcase the Bechdel test score of a film alongside its rating. The test has drawn criticism from experts and the film fraternity regarding its use to rate the female presence in a movie. The pundits believe that the score is too simplified and that the underlying criteria for a film to pass the test, namely 1) it features at least two women, 2) who have at least one dialogue, 3) about something other than a man, are egregiously reductive. In this research, we consider a few more parameters that highlight how females are represented in film, such as the number of female dialogues in a movie, dialogue genre, and part-of-speech tags in the dialogue; these parameters are missing from the existing criteria used to calculate the Bechdel score. The research analyzes 342 movie scripts to test the hypothesis that these extra parameters, together with the current Bechdel criteria, are significant in calculating the female representation score. The result of the Principal Component Analysis method concludes that female dialogue content is a key component and should be considered while measuring the representation of women in a work of fiction.
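A minimal sketch of the kind of analysis described follows: run PCA on per-movie features and inspect which features load most heavily on the leading components. The feature names and the random data are placeholders, not the study's corpus.

```python
# PCA over per-movie representation features (illustrative).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

features = ["bechdel_pass", "n_female_dialogues", "dialogue_genre_score", "pos_tag_ratio"]
X = np.random.default_rng(0).random((342, len(features)))   # stand-in for 342 scripts

pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(X))

print("explained variance ratio:", pca.explained_variance_ratio_)
for i, component in enumerate(pca.components_):
    top = max(zip(features, component), key=lambda t: abs(t[1]))
    print(f"PC{i + 1}: strongest loading on {top[0]} ({top[1]:+.2f})")
```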
Improvement of Central Composite Design in Modeling and Optimization of Simulation Experiments
Simulation modeling can be used to solve real-world problems and provides an understanding of a complex system. To develop a simplified model of a process simulation, a suitable experimental design is required in order to capture the surface characteristics. This paper presents the experimental design and algorithm used to model a process simulation for an optimization problem. CO2 liquefaction based on external refrigeration with two refrigeration circuits was used as the simulation case study. Latin Hypercube Sampling (LHS) was proposed to be combined with existing Central Composite Design (CCD) samples to improve the performance of CCD in generating the second-order model of the system. The second-order model was then used as the objective function of the optimization problem. The results showed that adding LHS samples to CCD samples helps capture surface curvature characteristics. A suitable number of LHS sample points should be chosen in order to obtain an accurate nonlinear model with a minimum number of simulation experiments.
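The following sketch illustrates the design-augmentation idea: combine face-centered CCD points with Latin Hypercube samples and fit a second-order (quadratic) response surface by least squares. The two-factor setting and the stand-in "simulation" function are illustrative only.

```python
# CCD augmented with LHS points, quadratic response surface fit (illustrative).
import itertools
import numpy as np
from scipy.stats import qmc

def ccd_points(k):
    """Face-centered central composite design in [-1, 1]^k."""
    factorial = np.array(list(itertools.product([-1, 1], repeat=k)), float)
    axial = np.vstack([v * np.eye(k)[i] for i in range(k) for v in (-1, 1)])
    center = np.zeros((1, k))
    return np.vstack([factorial, axial, center])

def quadratic_features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

simulate = lambda X: 3 + 2 * X[:, 0] - X[:, 1] + 0.5 * X[:, 0] * X[:, 1] + X[:, 0] ** 2

X_ccd = ccd_points(2)
X_lhs = qmc.scale(qmc.LatinHypercube(d=2, seed=1).random(10), [-1, -1], [1, 1])
X = np.vstack([X_ccd, X_lhs])                       # augmented design
coef, *_ = np.linalg.lstsq(quadratic_features(X), simulate(X), rcond=None)
print("fitted second-order coefficients:", np.round(coef, 3))
```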
Discovering Semantic Links Between Synonyms, Hyponyms and Hypernyms
This proposal aims at semantic enrichment between glossaries using the Simple Knowledge Organization System (SKOS) vocabulary to discover synonyms, hyponyms and hypernyms semi-automatically, in Brazilian Portuguese, generating new semantic relationships based on WordNet. To evaluate the quality of the proposed model, experiments were performed using two sets of new relations: one generated automatically and the other manually mapped by a domain expert. The evaluation metrics applied were precision, recall, F-score, and confidence interval. The results obtained demonstrate that the applied method is effective in the field of Oil Production and Extraction (E&P), which suggests that it can be used to improve the quality of terminological mappings. The procedure, although adding complexity in its elaboration, can be reproduced in other domains.
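A small sketch of the evaluation step described above follows: compare automatically discovered semantic relations against the expert-mapped gold set using precision, recall and F-score. The relation triples shown are hypothetical.

```python
# Precision/recall/F-score over automatic vs. expert relation sets (illustrative).
def evaluate(automatic, gold):
    tp = len(automatic & gold)
    precision = tp / len(automatic) if automatic else 0.0
    recall = tp / len(gold) if gold else 0.0
    f = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f

automatic = {("poço", "skos:broader", "reservatório"),
             ("petróleo", "skos:related", "óleo cru")}
gold = {("poço", "skos:broader", "reservatório"),
        ("petróleo", "skos:related", "óleo cru"),
        ("plataforma", "skos:narrower", "plataforma fixa")}
print("precision=%.2f recall=%.2f f-score=%.2f" % evaluate(automatic, gold))
```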
Deep Learning Based Fall Detection Using Simplified Human Posture
Falls are one of the major causes of injury and death among elderly people aged 65 and above. A support system to identify such abnormal activities has become extremely important with the increase in the ageing population. Pose estimation is a challenging task in itself, and it becomes even more challenging when it must be performed on the difficult poses that may occur during a fall. The location of the body provides a clue to where the person is at the time of the fall. This paper presents a vision-based tracking strategy in which the available joints are grouped into three feature points depending on the section of the body in which they are located. The three feature points, derived from different joint combinations, represent the upper or head region, the mid or torso region, and the lower or leg region. Tracking is always challenging when motion is involved; hence the idea is to locate these regions of the body in every frame and use them as the tracking strategy. Grouping the joints yields stable regions for tracking, and the location of the body parts provides crucial information to distinguish normal activities from falls.
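A hedged sketch of the joint-grouping idea is given below: estimated 2-D joints are collapsed into three feature points (head, torso, legs) by averaging the joints of each region. The joint names follow a COCO-like convention and are assumptions, not the paper's exact grouping.

```python
# Grouping pose-estimation joints into three region feature points (illustrative).
import numpy as np

REGIONS = {
    "head":  ["nose", "left_eye", "right_eye", "left_ear", "right_ear"],
    "torso": ["left_shoulder", "right_shoulder", "left_hip", "right_hip"],
    "legs":  ["left_knee", "right_knee", "left_ankle", "right_ankle"],
}

def region_points(joints):
    """joints: dict mapping joint name -> (x, y); missing joints are skipped."""
    points = {}
    for region, names in REGIONS.items():
        coords = np.array([joints[n] for n in names if n in joints], float)
        points[region] = coords.mean(axis=0) if len(coords) else None
    return points

# A fall heuristic could then track, e.g., the vertical drop of the head point
# relative to the leg point across frames.
```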
Use of Hierarchical Temporal Memory Algorithm in Heart Attack Detection
In order to reduce the number of deaths due to heart problems, we propose the use of the Hierarchical Temporal Memory (HTM) algorithm, a real-time anomaly detection algorithm. HTM is a cortical learning algorithm modeled on the neocortex and used for anomaly detection; in other words, it is based on a conceptual theory of how the human brain may work. It is powerful in predicting unusual patterns, anomaly detection and classification. In this paper, HTM has been implemented and tested on ECG datasets in order to detect cardiac anomalies. Experiments showed good performance in terms of specificity, sensitivity and execution time.
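A tiny sketch of the reported evaluation metrics follows: specificity and sensitivity computed from binary anomaly decisions against ECG ground truth. The label vectors are placeholders, not the paper's data.

```python
# Sensitivity and specificity from binary anomaly decisions (illustrative).
def sensitivity_specificity(y_true, y_pred):
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    return tp / (tp + fn), tn / (tn + fp)

y_true = [0, 0, 1, 1, 0, 1, 0, 0]   # 1 = annotated cardiac anomaly
y_pred = [0, 0, 1, 0, 0, 1, 1, 0]   # 1 = detector flagged the beat
sens, spec = sensitivity_specificity(y_true, y_pred)
print(f"sensitivity={sens:.2f} specificity={spec:.2f}")
```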
Development of Tools for Multi Vehicles Simulation with Robot Operating System and ArduPilot
One of the main difficulties in developing multi-robot systems (MRS) is related to the simulation and testing tools available. Indeed, if the differences between simulations and real robots are too significant, the transition from simulation to the robot will not be possible without another long development phase and will not allow the simulation to be validated. Moreover, testing different algorithmic solutions or robot modifications requires strong knowledge of the current tools and significant development time. Therefore, the availability of tools for MRS, particularly with flying drones, is crucial to enable the industrial emergence of these systems. This research presents the most commonly used tools for MRS simulation and their main shortcomings, and proposes complementary tools to improve the productivity of designers in the development of multi-vehicle solutions, focused on a fast learning curve and a rapid transition from simulation to real usage. The proposed contributions are based on existing open-source tools such as the Gazebo simulator combined with ROS (Robot Operating System) and the open-source multi-platform autopilot ArduPilot, in order to bring them to a broad audience.
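As a hedged sketch of working with several simulated ArduPilot vehicles from one script, the snippet below connects to multiple SITL instances via pymavlink; the per-instance UDP port spacing is an assumption about the launch setup, not a fixed convention of the tools described.

```python
# Monitoring several simulated ArduPilot vehicles via pymavlink (illustrative).
from pymavlink import mavutil

N_VEHICLES = 3
links = []
for i in range(N_VEHICLES):
    port = 14550 + 10 * i                      # assumed per-instance port spacing
    link = mavutil.mavlink_connection(f"udp:127.0.0.1:{port}")
    link.wait_heartbeat()                      # block until the autopilot is seen
    links.append(link)
    print(f"vehicle {i}: system id {link.target_system} connected on port {port}")

# Example: read one position message from each vehicle.
for i, link in enumerate(links):
    msg = link.recv_match(type="GLOBAL_POSITION_INT", blocking=True, timeout=5)
    if msg:
        print(f"vehicle {i}: lat={msg.lat / 1e7:.6f} lon={msg.lon / 1e7:.6f}")
```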
Localization of Geospatial Events and Hoax Prediction in the UFO Database
Unidentified Flying Objects (UFOs) have been an interesting topic for most enthusiasts, and hence people all over the United States report such sightings online at the National UFO Reporting Center (NUFORC). Some of these reports are hoaxes, and among those that seem legitimate, our task is not to establish that these events are indeed related to flying objects from aliens in outer space. Rather, we intend to identify whether a report is a hoax, as identified by the UFO database team with their existing curation criteria. The database provides a wealth of information that can be exploited for various analyses and insights, such as social reporting, identifying real-time spatial events, and much more. We perform analysis to localize these time-series geospatial events and correlate them with known real-time events. This paper does not confirm any legitimacy of alien activity, but rather attempts to gather information from likely legitimate reports of UFOs by studying the online reports. These events happen in geospatial clusters and are also time-based. We look at cluster density and data visualization to search the space of various cluster realizations and decide on the most probable clusters, which provide information about the proximity of such activity. A random forest classifier is also presented to distinguish true events from hoax events, using the best available features such as region, week, time period and duration. Lastly, we show the performance of the scheme on various days and correlate with real-time events, where one of the UFO reports strongly correlates with a missile test conducted in the United States.
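A minimal sketch of the hoax classifier described above is given below: a random forest over the report features mentioned (region, week, time period, duration). The synthetic data and encoding choices are placeholders for the NUFORC records.

```python
# Random forest hoax classifier on report features (illustrative, synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 1000
X = np.column_stack([
    rng.integers(0, 50, n),      # region (encoded)
    rng.integers(1, 53, n),      # week of year
    rng.integers(0, 4, n),       # time period (night/morning/afternoon/evening)
    rng.exponential(300, n),     # duration in seconds
])
y = rng.integers(0, 2, n)        # 1 = hoax as curated by the database team

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy (synthetic data):", clf.score(X_te, y_te))
```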
Low-Complexity Channel Estimation Algorithm for MIMO-OFDM Systems
One of the main challenges for a MIMO-OFDM system to achieve the expected performance in terms of data rate and robustness against multi-path fading channels is channel estimation. Several methods have been proposed in the literature based on either least squares (LS) or minimum mean squared error (MMSE) estimators. These methods have high implementation complexity, as they require the inversion of large matrices. In order to overcome this problem and reduce the complexity, this paper presents a solution that benefits from the use of the STBC encoder and transforms the channel estimation process into a set of simple linear operations. The proposed method is evaluated via simulation over an AWGN-Rayleigh fading channel. Simulation results show that the bit error rate (BER) differs by at most 6.85% from the one obtained in the ideal case where the receiver has perfect knowledge of the channel.
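For context, the sketch below shows the classic per-subcarrier least-squares channel estimate for an OFDM pilot symbol, which needs only one division per subcarrier instead of a large matrix inversion; this illustrates the baseline LS idea, not the paper's STBC-based scheme.

```python
# Per-subcarrier LS channel estimation for one OFDM pilot symbol (illustrative).
import numpy as np

rng = np.random.default_rng(0)
n_sc = 64
pilots = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], n_sc)   # known QPSK pilots
h_true = (rng.standard_normal(n_sc) + 1j * rng.standard_normal(n_sc)) / np.sqrt(2)
noise = 0.05 * (rng.standard_normal(n_sc) + 1j * rng.standard_normal(n_sc))

received = h_true * pilots + noise
h_ls = received / pilots                 # element-wise LS estimate per subcarrier

mse = np.mean(np.abs(h_ls - h_true) ** 2)
print(f"LS estimation MSE: {mse:.4f}")
```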
Distributed Cost-Based Scheduling in Cloud Computing Environment
Cloud computing can be defined as one of the prominent technologies that lets a user change, configure and access services online. It is a computing paradigm that helps save a user's cost and time, and its use can be found in various fields such as education, health and banking. Cloud computing is an internet-dependent technology, so it is the major responsibility of Cloud Service Providers (CSPs) to take care of the data stored by users at data centers. Scheduling in a cloud computing environment plays a vital role, as cloud providers need to schedule resources effectively in order to achieve maximum utilization and user satisfaction. Job scheduling for cloud computing is analyzed in the following work, and CloudSim 3.0.3 is utilized to model the task allocation and the distributed scheduling methods. This research work discusses job scheduling for the distributed processing environment and, by exploring this issue, shows that it works with minimum time and lower cost. In this work, two load balancing techniques have been employed, 'Throttled stack adjustment policy' and 'Active VM load balancing policy', with two brokerage services, 'Advanced Response Time' and 'Reconfigure Dynamically', to evaluate the VM_Cost, DC_Cost, Response Time, and Data Processing Time. The proposed techniques are compared with the Round Robin scheduling policy.
A Real Time Ultra-Wideband Location System for Smart Healthcare
Driven by the demand for intelligent monitoring in rehabilitation centers and hospitals, a high-accuracy real-time location system based on UWB (ultra-wideband) technology is proposed. The system measures the precise location of a specific person, traces his movement and visualizes his trajectory on screen for doctors or administrators. Doctors can therefore view the position of the patient at any time and find them immediately and exactly when something urgent happens. In our design process, different algorithms were discussed and their errors analyzed. In addition, we discuss a simple but effective way of correcting the antenna delay error. By choosing the best algorithm and correcting errors with the corresponding methods, the system attains good accuracy. Experiments indicated that the ranging error of the system is lower than 7 cm, the locating error is lower than 20 cm, and the refresh rate exceeds 5 updates per second. In future work, by embedding the system in wearable IoT (Internet of Things) devices, it could provide not only physical parameters but also the activity status of the patient, which would help doctors considerably.
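The sketch below shows one standard way such a UWB system can turn anchor-to-tag range measurements into a 2-D position: linearized least-squares multilateration. The anchor layout and ranges are made-up values and the method is illustrative, not necessarily the algorithm chosen in the paper.

```python
# Linearized least-squares multilateration from UWB ranges (illustrative).
import numpy as np

def multilaterate(anchors, ranges):
    """anchors: (N, 2) known positions; ranges: (N,) measured distances."""
    x0, y0 = anchors[0]
    d0 = ranges[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], ranges[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return pos

anchors = np.array([[0.0, 0.0], [6.0, 0.0], [6.0, 4.0], [0.0, 4.0]])
true_pos = np.array([2.5, 1.5])
ranges = np.linalg.norm(anchors - true_pos, axis=1) + 0.05  # ~5 cm ranging error
print("estimated position:", multilaterate(anchors, ranges))
```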
Agile Software Development Implementation in Developing a Diet Tracker Mobile Application
The technology era drives people to use mobile phones to support their daily activities. Technology develops at a rapid pace, which pushes IT companies to adjust to any technology change in order to satisfy their customers. As a result, many companies in the USA have moved from a systematic software development approach to an agile software development approach, so that mobile applications can be developed in short phases to fulfill users' needs. As the systematic approach is considered time consuming, costly and risky, agile software development has become a more popular approach for developing software, including mobile applications. This paper reflects on a short-term project to develop a diet tracker mobile application using agile software development, focused on applying the Scrum framework in the development process.
Modeling of Water Erosion in the M'Goun Watershed Using OpenGIS Software
Water erosion is the major form of erosion shaping the earth's surface. Modeling water erosion requires the use of GIS software and programs, whether commercial or closed source. The very high prices of commercial GIS licenses motivate users and researchers to look for open-source software that is as relevant and applicable as the proprietary GIS. The objective of this study is the modeling of water erosion and the hydrogeological and morphophysical characterization of the Oued M'Goun watershed (southern flank of the Central High Atlas), carried out with free GIS programs. Very pertinent results are obtained by executing tasks and algorithms in a simple and easy way. The various geoscientific and geostatistical analyses of a digital elevation model (SRTM, 30 m resolution), combined with the processing and interpretation of satellite imagery, allowed us to characterize the studied region and to map the areas most vulnerable to water erosion.
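As an example of one typical DEM-based step in such a study, the sketch below computes slope (a key erosion factor) from an SRTM-like elevation grid with a 30 m cell size; the small array stands in for the real raster and the step is illustrative rather than the study's full workflow.

```python
# Slope from a DEM grid via finite differences (illustrative).
import numpy as np

def slope_degrees(dem, cell_size=30.0):
    """Slope in degrees from a 2-D elevation array."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

dem = np.array([[1200, 1210, 1225],
                [1198, 1215, 1240],
                [1195, 1220, 1260]], dtype=float)
print(np.round(slope_degrees(dem), 1))
```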
A Survey on MAC Protocols for Vehicular Ad-Hoc Networks
Vehicular Ad-hoc Network (VANET) is an emerging and very promising technology that places great demands on the access capability of existing wireless technology. VANETs help improve traffic safety and efficiency. Each vehicle can exchange information to inform other vehicles about the current status of the traffic flow or about a dangerous situation such as an accident. To achieve this, a reliable and efficient Medium Access Control (MAC) protocol with minimal transmission collisions is required. High-speed nodes, the absence of infrastructure, variations in topology, and QoS requirements make it difficult to design a MAC protocol for vehicular networks. Several MAC protocols have been proposed for VANETs to ensure that all vehicles can send safety messages without collisions by reducing the end-to-end delay and packet loss ratio. This paper gives an overview of several proposed MAC protocols for VANETs along with their benefits and limitations, and presents an overall classification based on their characteristics.
A Comparative Analysis Approach Based on Fuzzy AHP, TOPSIS and PROMETHEE for the Selection Problem of GSCM Solutions
Sustainable economic growth is nowadays driving firms to extend toward the adoption of many green supply chain management (GSCM) solutions. However, the evaluation and selection of these solutions is a matter of concern that requires very serious decisions and involves complexity owing to the presence of various associated factors. To address this problem, a comparative analysis approach based on multi-criteria decision-making methods is proposed for the adequate evaluation of sustainable supply chain management solutions. In the present paper, we propose an integrated decision-making model based on FAHP (Fuzzy Analytic Hierarchy Process), TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution) and PROMETHEE (Preference Ranking Organisation METHod for Enrichment Evaluations) to contribute to a better understanding and development of new sustainable strategies for industrial organizations. Owing to the varied importance of the selected criteria, FAHP is used to identify the evaluation criteria and assign the importance weight of each criterion, while the TOPSIS and PROMETHEE methods employ these weighted criteria as inputs to evaluate and rank the alternatives. The main objective is to provide a comparative analysis based on the TOPSIS and PROMETHEE processes to help make sound and reasoned decisions on the GSCM solution selection problem.
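A short sketch of the TOPSIS ranking step is given below: alternatives are scored against weighted criteria (the weights would come from FAHP in the proposed model). The decision matrix and weights are illustrative, not the paper's data.

```python
# TOPSIS ranking of GSCM alternatives (illustrative).
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria; benefit[j]=True if higher is better."""
    norm = matrix / np.linalg.norm(matrix, axis=0)            # vector normalization
    v = norm * weights                                         # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))    # positive ideal solution
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))     # negative ideal solution
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                             # closeness coefficients

matrix = np.array([[7.0, 0.4, 3.0],    # GSCM solution A: cost, emissions, flexibility
                   [5.0, 0.6, 4.0],    # solution B
                   [6.0, 0.3, 2.0]])   # solution C
weights = np.array([0.5, 0.3, 0.2])    # e.g. derived from FAHP
benefit = np.array([False, False, True])
print("closeness scores:", np.round(topsis(matrix, weights, benefit), 3))
```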
Usability in E-Commerce Websites: Results of Eye Tracking Evaluations
Usability is one of the most important quality attributes for web-based information systems, and for e-commerce applications it becomes even more prominent. In this study, we aimed to explore the features that experienced users seek in e-commerce applications, using the eye tracking method in our evaluations. Eye movement data were obtained with the eye tracker and analyzed in terms of task completion time and number of fixations, as well as heat map and gaze plot measures. The results of the analysis show that participants' eye movements are too static in certain areas and that their areas of interest are scattered across many different places; it was determined that this causes users to fail to complete their transactions. Based on the findings, we outline the issues that hinder the usability of e-commerce websites and propose solutions to address them. In this way, it is expected that e-commerce sites can be developed that leave experienced users more satisfied.
Model Predictive Control Using Thermal Inputs for Crystal Growth Dynamics
Recently, crystal growth technologies have made progress driven by the requirement for high-quality crystal materials. Actively controlling the crystal growth dynamics by external forces is useful for reducing composition non-uniformity. In this study, a control method based on model predictive control using thermal inputs is proposed for the crystal growth dynamics of semiconductor materials. The crystal growth dynamics considered here are governed by the continuity, momentum, energy, and mass transport equations. To establish a control method for such thermal fluid systems, we adopt model predictive control, a kind of optimal feedback control in which the control performance over a finite future is optimized with a performance index that has a moving initial time and terminal time. The objective of this study is to establish a model predictive control method for the crystal growth dynamics of semiconductor materials.
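To make the receding-horizon idea behind model predictive control concrete, the sketch below applies it to a toy linear thermal model with one state and one thermal input; the plant, horizon, and weights are illustrative and unrelated to the actual crystal growth equations.

```python
# Receding-horizon (MPC) control of a toy linear thermal model (illustrative).
import numpy as np

A, B = 0.95, 0.08            # x[k+1] = A*x[k] + B*u[k]  (toy thermal dynamics)
N, Q, R = 10, 1.0, 0.05      # horizon, state weight, input weight
x_ref = 1.0                  # target (e.g. a normalized temperature)

def mpc_input(x0):
    """Solve the finite-horizon quadratic problem by least squares, apply u[0]."""
    # Prediction: x[k] = A^k x0 + sum_j A^(k-1-j) B u[j]
    F = np.array([[A ** (k - 1 - j) * B if j < k else 0.0 for j in range(N)]
                  for k in range(1, N + 1)])
    g = np.array([A ** k * x0 for k in range(1, N + 1)])
    # Minimize sum Q*(x[k]-x_ref)^2 + R*u[k]^2 over the horizon.
    H = np.vstack([np.sqrt(Q) * F, np.sqrt(R) * np.eye(N)])
    y = np.concatenate([np.sqrt(Q) * (x_ref - g), np.zeros(N)])
    u, *_ = np.linalg.lstsq(H, y, rcond=None)
    return u[0]              # receding horizon: only the first input is applied

x = 0.0
for step in range(30):
    x = A * x + B * mpc_input(x)
print(f"state after 30 steps: {x:.3f} (target {x_ref})")
```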
Metamodel for Artefacts in Service Engineering Analysis and Design
As the process of developing a service system, the term 'service engineering' evolves in scope and definition. To achieve an integrated understanding of the process, a general framework and an ontology are required. This paper extends a previously built service engineering framework by exploring metamodels for the framework artefacts, based on a foundational ontology and a metamodel landscape. The first part of this paper presents a correlation map between the proposed framework and the ontology, as a form of evaluation of the framework's conceptual coverage; the mapping also serves to characterize the artefacts to be produced for each activity in the framework. The second part describes potential metamodels, drawn from the metamodel landscape, to be used as alternative formats of the framework artefacts. The results suggest that the framework sufficiently covers the ontological concepts, both in the general service context and in the software service context. The metamodel exploration enriches the suggested artefact formats from the original eighteen to thirty metamodel alternatives.
Cloud Enterprise Application Provider Selection Model for the Small and Medium Enterprise: A Pilot Study
Enterprise Applications (EAs) help organizations achieve operational excellence and competitive advantage. Over time, most Small and Medium Enterprises (SMEs), which are known to be the major drivers of most thriving global economies, have used the costly on-premise versions of these applications, making it difficult for them to thrive competitively in the same market environment as their large enterprise counterparts. The advent of cloud computing presents SMEs with an affordable offer and great opportunities, as such EAs can be cloud-hosted and rented on a pay-per-use basis that does not require huge initial capital. However, as there are numerous Cloud Service Providers (CSPs) offering EAs as Software-as-a-Service (SaaS), there is the challenge of choosing a suitable provider whose Quality of Service (QoS) meets the organization's customized requirements. The proposed model addresses this challenge and goes a step further to select the most affordable among a shortlist of CSPs. Before developing the instrument and conducting the pilot test, the researchers conducted a structured interview with three experts to validate the proposed model. The validity and reliability of the instrument were then tested with experts and typical respondents, and analyzed with SPSS 22. The results confirmed the validity of the proposed model and the validity and reliability of the instrument.
A Formal Property Verification for Aspect-Oriented Programs in Software Development
Software development for complex systems requires efficient and automatic tools that can be used to verify the satisfiability of critical properties such as security properties. With the emergence of Aspect-Oriented Programming (AOP), considerable work has been done to better modularize the separation of concerns in software design and implementation. The goal is to prevent cross-cutting concerns from being scattered across the multiple modules of the program and tangled with other modules. One of the key challenges in aspect-oriented programs is to ensure that all the pieces put together at weaving time satisfy the overall system requirements. Our paper focuses on this problem and proposes a formal verification approach for a given property of the woven program. The approach is based on the control flow graph (CFG) of the woven program and the use of a satisfiability modulo theories (SMT) solver to check whether each property (represented by one aspect) is satisfied once the weaving is done.
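A hedged sketch of the SMT-checking step follows: a path condition extracted from a (hypothetical) woven program's CFG is encoded and a solver is asked whether the property can be violated along that path. It uses the Z3 Python bindings; the variables and the property are made up for illustration.

```python
# SMT check of a property along one CFG path (illustrative, using Z3).
from z3 import Int, Bool, Solver, And, Not, sat

balance, amount = Int("balance"), Int("amount")
authorized = Bool("authorized")

# Path condition gathered along one CFG path of the woven program (assumed).
path_condition = And(amount > 0, balance >= amount)

# Security property the aspect is meant to enforce: withdrawals only when authorized.
property_holds = authorized

s = Solver()
s.add(path_condition, Not(property_holds))      # search for a counterexample
if s.check() == sat:
    print("property can be violated on this path, e.g.:", s.model())
else:
    print("property holds on this path")
```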
A Comprehensive Evaluation of Supervised Machine Learning for the Phase Identification Problem
Power distribution circuits undergo frequent network topology changes that are often left undocumented. As a result, the documentation of a circuit's connectivity becomes inaccurate with time. The lack of reliable circuit connectivity information is one of the biggest obstacles to modeling, monitoring, and controlling modern distribution systems. To enhance the reliability and efficiency of electric power distribution systems, the circuit's connectivity information must be updated periodically. This paper focuses on one critical component of a distribution circuit's topology: the secondary transformer to phase association. This topology component describes the set of phase lines that feed power to a given secondary transformer (and therefore a given group of power consumers). Recovering this documentation is called Phase Identification and is typically performed with physical measurements. These measurements can take on the order of several months, but with supervised learning, the time required can be reduced significantly. This paper compares several such methods applied to Phase Identification for a large range of real distribution circuits, describes a method of training data selection, describes preprocessing steps unique to the Phase Identification problem, and ultimately describes a method which obtains high accuracy (> 96% in most cases, > 92% in the worst case) using only 5% of the measurements typically used for Phase Identification.
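A hedged sketch of the supervised-learning framing follows: a classifier maps per-transformer measurement features (e.g. summary statistics of voltage time series) to a phase label using a small labeled subset. The synthetic features, the gradient-boosting model, and the 5% labeling ratio mirror the idea only; they are not the paper's specific method or data.

```python
# Supervised Phase Identification from a small labeled subset (illustrative).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_transformers, n_features = 2000, 12
X = rng.standard_normal((n_transformers, n_features))   # stand-in voltage features
y = rng.integers(0, 3, n_transformers)                  # phase A/B/C labels

# Only a small fraction of transformers have field-verified phase labels.
X_lab, X_unlab, y_lab, _ = train_test_split(X, y, train_size=0.05, random_state=0)

clf = GradientBoostingClassifier().fit(X_lab, y_lab)
predicted_phases = clf.predict(X_unlab)                  # infer the undocumented rest
print("predicted phase distribution:", np.bincount(predicted_phases))
```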
Multidimensional Performance Tracking
In this study, a model, together with a software tool that implements it, has been developed to determine the performance ratings of employees in an organization operating in the information technology sector, using indicators obtained from employees' online work data. The Weighted Sum (WS) method and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), both based on a multidimensional decision-making approach, were used in the study. WS and TOPSIS are multidimensional decision-making (MDDM) methods that allow all dimensions to be evaluated together with specific weights, so that the problem of online performance tracking of employees can be addressed objectively. The application of the WS and TOPSIS mathematical methods, which can combine alternatives with a large number of dimensions and reach a simultaneous solution, has been implemented through an online performance tracking software tool. In applying the WS and TOPSIS methods, objective dimension weights were calculated with the entropy information (EI) and standard deviation (SD) methods from the data obtained through online performance tracking, a decision matrix was formed from the performance scores of each employee, and a single performance score was calculated for each employee. Based on the calculated performance score, each employee was given a performance evaluation decision. Pareto set evidence and comparative mathematical analysis validate that the employees' performance preference rankings produced by the WS and TOPSIS methods are closely related. This suggests the compatibility, applicability, and validity of the proposed method for MDDM problems in which a large number of alternatives and dimension types are taken into account. With this study, an objective, realistic, feasible and understandable mathematical method, together with a software tool that implements it, has been demonstrated. This is considered preferable because of the subjectivity, limitations and high cost of the methods traditionally used for measurement and performance appraisal in the information technology sector.
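A small sketch of the objective-weighting step described above follows: dimension weights are derived from the entropy of the normalized decision matrix and each employee is then scored with the Weighted Sum method. The matrix values are invented and the indicators assume strictly positive values.

```python
# Entropy-based dimension weights and Weighted Sum scoring (illustrative).
import numpy as np

def entropy_weights(matrix):
    """Objective weights from the entropy of a strictly positive decision matrix."""
    p = matrix / matrix.sum(axis=0)                        # column-normalize
    k = 1.0 / np.log(matrix.shape[0])
    entropy = -k * np.sum(p * np.log(p), axis=0)
    diversity = 1.0 - entropy                              # degree of divergence
    return diversity / diversity.sum()

# Rows: employees; columns: online-tracking indicators (hypothetical).
matrix = np.array([[120.0, 0.92, 35.0],
                   [ 95.0, 0.88, 42.0],
                   [140.0, 0.95, 30.0]])
w = entropy_weights(matrix)
ws_scores = (matrix / matrix.max(axis=0)) @ w              # Weighted Sum scores
print("entropy weights:", np.round(w, 3))
print("WS performance scores:", np.round(ws_scores, 3))
```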
CompleX-Machine: An Automated Testing Tool Using X-Machine Theory
This paper is aimed at creating an automatic Java X-Machine testing tool for software development. The nature of software development is changing; thus, the type of software testing tools required is also changing. Software is growing increasingly complex and, partly due to the commercial impetus for faster software releases with new features and value, is increasingly in danger of containing faults. These faults can incur huge costs for software development organisations and users; Cambridge Judge Business School's research estimated the cost of software bugs to the global economy at $312 billion. Beyond the cost, faster software development methodologies and the increasing expectation that developers also act as testers are driving demand for faster, automated, and effective tools to prevent potential faults as early as possible in the software development lifecycle. Using X-Machine theory, this paper explores a new tool to address software complexity, changing expectations on developers, and faster development pressures and methodologies, with a view to reducing the huge cost of fixing software bugs.
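For readers unfamiliar with the formalism, the sketch below shows a tiny stream X-machine, i.e. a finite-state machine whose transitions are partial functions over a memory and an input stream; the example machine (a capped counter) is invented simply to show how test inputs could be driven through such a model, and is not the CompleX-Machine tool itself.

```python
# A tiny stream X-machine: states, memory, and partial processing functions.
def inc(mem, x):
    return ("ok", mem + x) if x > 0 and mem + x <= 10 else None   # partial function

def reset(mem, x):
    return ("reset", 0) if x == 0 else None

# state -> list of (processing function, next state)
transitions = {"counting": [(inc, "counting"), (reset, "counting")]}

def run(inputs, state="counting", mem=0):
    outputs = []
    for x in inputs:
        for func, nxt in transitions[state]:
            result = func(mem, x)
            if result is not None:               # the first applicable function fires
                out, mem = result
                outputs.append(out)
                state = nxt
                break
        else:
            raise ValueError(f"no transition applies for input {x} in state {state}")
    return outputs, state, mem

print(run([3, 4, 0, 2]))    # (['ok', 'ok', 'reset', 'ok'], 'counting', 2)
```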
Impact of Extended Enterprise Resource Planning in the Context of Cloud Computing on Industries and Organizations
The Extended Enterprise Resource Planning (ERPII) system usually requires massive amounts of storage space, powerful servers, and large upfront and ongoing investments to purchase and manage the software and the related hardware, which many organizations cannot afford. In recent decades, organizations have preferred to adapt their business structures to new technologies in order to remain competitive in the world economy. Cloud computing, one of the tools of information technology (IT), is a modern paradigm that represents the next-generation application architecture. Cloud computing also has advantages that reduce costs in several ways, such as lower upfront costs for all computing infrastructure and lower costs for maintenance and support. On the other hand, traditional ERPII does not cope well with huge amounts of data and with the relations between organizations. In this study, based on a literature review, ERPII is investigated in the context of cloud computing, where organizations can operate more efficiently and where ERPII can respond to organizations' needs regarding large amounts of data and inter-organizational relations.