International Science Index

40
10012043
An Effort at Improving Reliability of Laboratory Data in Titrimetric Analysis for Zinc Sulphate Tablets Using Validated Spreadsheet Calculators
Abstract:

The requirement for maintaining data integrity in laboratory operations is critical for regulatory compliance. Automation of procedures reduces the incidence of human error. Quality control laboratories located in low-income economies may face barriers in attempts to automate their processes. Since data from quality control tests on pharmaceutical products are used in making regulatory decisions, it is important that laboratory reports are accurate and reliable. Zinc sulphate (ZnSO4) tablets are used in the treatment of diarrhea in the pediatric population and as adjunct therapy in COVID-19 regimens. Unfortunately, zinc content in these formulations is determined titrimetrically, a manual analytical procedure. The assay for ZnSO4 tablets involves time-consuming steps that contain mathematical formulae prone to calculation errors. To achieve consistency, save costs, and improve data integrity, validated spreadsheets were developed to simplify the two critical steps in the analysis of ZnSO4 tablets: standardization of 0.1 M sodium edetate (EDTA) solution, and the complexometric titration assay procedure. The assay method in the United States Pharmacopoeia was used to create a process flow for ZnSO4 tablets. For each step in the process, different formulae were input into two spreadsheets to automate calculations. Further checks were created within the automated system to ensure the validity of replicate analyses in titrimetric procedures. Validations were conducted using five data sets of manually computed assay results, and the acceptance criteria set for the protocol were met. Significant p-values (p < 0.05, α = 0.05, at 95% confidence interval) were obtained from Student's t-test evaluation of the mean values for manually calculated and spreadsheet results at all levels of the analysis flow. Right-first-time analysis and principles of data integrity were enhanced by use of the validated spreadsheet calculators in titrimetric evaluations of ZnSO4 tablets. Human errors in calculations were minimized when procedures were automated in quality control laboratories. The assay procedure for the formulation was achieved in a time-efficient manner with a greater level of accuracy. This project is expected to promote cost savings for laboratory business models.
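As an illustration of the kind of formula such a spreadsheet automates, the sketch below computes EDTA molarity from replicate standardization titrations and applies a replicate-validity check. It is a minimal sketch, not the authors' validated calculator: pure zinc as the primary standard, 1:1 Zn:EDTA stoichiometry, and the 1% RSD acceptance limit are assumptions.

```python
# Minimal sketch (not the authors' validated spreadsheet): EDTA
# standardization against pure zinc, assuming 1:1 Zn:EDTA complexation
# and a hypothetical RSD acceptance limit for replicates.
from statistics import mean, stdev

ZN_ATOMIC_MASS = 65.38  # g/mol

def edta_molarity(zinc_mg: float, titre_ml: float) -> float:
    """Molarity of EDTA from mg of zinc consumed per mL of titrant."""
    return (zinc_mg / ZN_ATOMIC_MASS) / titre_ml  # mmol / mL = mol/L

def replicates_valid(values, max_rsd_pct=1.0):
    """Replicate-validity check (hypothetical 1% RSD limit)."""
    rsd = 100 * stdev(values) / mean(values)
    return rsd <= max_rsd_pct, rsd

runs = [(130.8, 20.05), (131.1, 20.10), (130.5, 19.98)]  # (mg Zn, mL EDTA)
molarities = [edta_molarity(m, v) for m, v in runs]
ok, rsd = replicates_valid(molarities)
print(f"mean molarity {mean(molarities):.4f} M, RSD {rsd:.2f}%:",
      "valid" if ok else "repeat analysis")
```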

39
10012044
The Reproducibility and Repeatability of Modified Likelihood Ratio for Forensic Handwriting Examination
Abstract:
The forensic use of handwriting depends on the analysis, comparison, and evaluation decisions made by forensic document examiners. When using biometric technology in forensic applications, it is necessary to compute the Likelihood Ratio (LR) for quantifying the strength of evidence under two competing hypotheses, namely the prosecution and the defense hypotheses, wherein a set of assumptions and methods will be made for a given data set. It is therefore important to know how repeatable and reproducible our estimated LR is. This paper evaluated the accuracy and reproducibility of examiners' decisions. Confidence intervals for the estimated LR were presented so as not to get an incorrect estimate that would be used to deliver a wrong judgment in a court of law. The estimation of the LR is fundamentally a Bayesian concept, and we used two LR estimators, namely Logistic Regression (LoR) and the Kernel Density Estimator (KDE), in this paper. The repeatability evaluation was carried out by retesting the initial experiment after an interval of six months to observe whether examiners would repeat their decisions for the estimated LR. The experimental results, which are based on a handwriting dataset, show that the LR has different confidence intervals, which implies that the LR cannot be estimated with the same certainty everywhere. Though the LoR performed better than the KDE when tested on the same dataset, the two LR estimators investigated showed a consistent region in which the LR value can be estimated confidently. These two findings advance our understanding of the LR when used to compute the strength of evidence in forensic handwriting examination.
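The sketch below illustrates the two estimator families named above on synthetic similarity scores (not the paper's handwriting data): a KDE ratio of the score densities under the two hypotheses, and a logistic-regression calibrator whose posterior odds equal the LR under the balanced training priors assumed here.

```python
# Hedged sketch of score-based LR estimation; all scores are synthetic.
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
same_writer = rng.normal(0.8, 0.10, 500)   # similarity scores under H_p
diff_writer = rng.normal(0.4, 0.15, 500)   # similarity scores under H_d

kde_p, kde_d = gaussian_kde(same_writer), gaussian_kde(diff_writer)

def lr_kde(score):
    """LR as the ratio of the two estimated score densities."""
    return kde_p(score)[0] / kde_d(score)[0]

X = np.concatenate([same_writer, diff_writer]).reshape(-1, 1)
y = np.concatenate([np.ones(500), np.zeros(500)])
clf = LogisticRegression().fit(X, y)

def lr_logreg(score):
    """Posterior odds from LoR; equals the LR under 50/50 training priors."""
    p = clf.predict_proba([[score]])[0, 1]
    return p / (1 - p)

for s in (0.45, 0.60, 0.75):
    print(f"score={s:.2f}  LR_KDE={lr_kde(s):.2f}  LR_LoR={lr_logreg(s):.2f}")
```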
38
10011948
A Retrospective Cross-Sectional Study on the Prevalence and Factors Associated with Virological Non-Suppression among HIV-Positive Adult Patients on Antiretroviral Therapy in Woliso Town, Oromia, Ethiopia
Abstract:
Background: HIV virological failure still remains a problem in HIV/AIDS treatment and care. This study aimed to describe the prevalence and identify the factors associated with viral non-suppression among HIV-positive adult patients on antiretroviral therapy in Woliso Town, Oromia, Ethiopia. Methods: A retrospective cross-sectional study was conducted among 424 HIV-positive patients attending antiretroviral therapy (ART) in Woliso Town during the period from August 25, 2020 to August 30, 2020. Data collected from patient medical records were entered into Epi Info version 2.3.2.1 and exported to SPSS version 21.0 for analysis. Logistic regression analysis was done to identify factors associated with viral load non-suppression, and the statistical significance of odds ratios was declared using the 95% confidence interval and a p-value < 0.05. Results: A total of 424 patients were included in this study. The mean age (± SD) of the study participants was 39.88 (± 9.995) years. The prevalence of HIV viral load non-suppression was 55 (13.0%), 95% CI (9.9-16.5). Second-line ART treatment regimen (Adjusted Odds Ratio (AOR) = 8.98, 95% Confidence Interval (CI): 2.64, 30.58) and routine viral load testing (AOR = 0.01, 95% CI: 0.001, 0.02) were significantly associated with virological non-suppression. Conclusion: Virological non-suppression was high, which hinders the achievement of the third of the global 95-95-95 targets. The second-line regimen and routine viral load testing were significantly associated with virological non-suppression. This suggests the need to assess the effectiveness of antiretroviral drugs for epidemic control. It also clearly shows the need to decentralize third-line ART treatment for those patients in need.
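A minimal sketch of this analysis style on synthetic data (not the study records), assuming hypothetical effect directions: logistic regression with adjusted odds ratios and their 95% CIs.

```python
# Sketch only: synthetic cohort, assumed effects; not the study dataset.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 424
df = pd.DataFrame({
    "second_line": rng.integers(0, 2, n),  # 1 = second-line ART regimen
    "routine_vl": rng.integers(0, 2, n),   # 1 = routine viral load testing
})
# synthetic outcome with assumed effect directions
logit = -2.0 + 1.5 * df["second_line"] - 1.5 * df["routine_vl"]
df["non_suppressed"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["second_line", "routine_vl"]])
fit = sm.Logit(df["non_suppressed"], X).fit(disp=0)
table = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
table.columns = ["AOR", "CI 2.5%", "CI 97.5%"]
print(table.round(3))
```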
37
10010490
Discovering Semantic Links Between Synonyms, Hyponyms and Hypernyms
Abstract:
This proposal aims at semantic enrichment between glossaries using the Simple Knowledge Organization System (SKOS) vocabulary to discover synonyms, hyponyms and hypernyms semi-automatically, in Brazilian Portuguese, generating new semantic relationships based on WordNet. To evaluate the quality of the proposed model, experiments were performed using two sets containing new relations, one generated automatically and the other mapped manually by a domain expert. The evaluation metrics applied were precision, recall, F-score, and confidence interval. The results obtained demonstrate that the method applied in the field of Oil Production and Extraction (E&P) is effective, which suggests that it can be used to improve the quality of terminological mappings. The procedure, although adding complexity in its elaboration, can be reproduced in other domains.
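A small sketch of the evaluation metrics named above, on hypothetical relation sets: precision, recall and F-score of automatically generated relations against an expert gold standard.

```python
# Hypothetical relation triples; not the E&P glossary data.
auto = {("well", "broader", "drilling"), ("crude", "related", "petroleum"),
        ("pump", "narrower", "equipment")}
gold = {("well", "broader", "drilling"), ("crude", "related", "petroleum"),
        ("derrick", "narrower", "structure")}

tp = len(auto & gold)              # relations found in both sets
precision = tp / len(auto)         # fraction of generated relations that are correct
recall = tp / len(gold)            # fraction of gold relations that were found
f_score = 2 * precision * recall / (precision + recall)
print(f"P={precision:.2f} R={recall:.2f} F={f_score:.2f}")
```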
36
10009179
Probabilistic Life Cycle Assessment of the Nano Membrane Toilet
Abstract:
Developing countries are nowadays confronted with great challenges related to domestic sanitation services, in view of imminent water scarcity. Contemporary sanitation technologies established in these countries are likely to pose health risks unless waste management standards are followed properly. This paper provides a solution for sustainable sanitation with the development of an innovative toilet system, called the Nano Membrane Toilet (NMT), which has been developed by Cranfield University and sponsored by the Bill & Melinda Gates Foundation. This technology converts human faeces into energy through gasification and provides treated wastewater from urine through membrane filtration. In order to evaluate the environmental profile of the NMT system, a deterministic life cycle assessment (LCA) has been conducted in the SimaPro software employing the Ecoinvent v3.3 database. This study has determined the factors contributing most to the environmental footprint of the NMT system. However, as sensitivity analysis has identified certain operating parameters as critical for the robustness of the LCA results, adopting a stochastic approach to the Life Cycle Inventory (LCI) will comprehensively capture the input data uncertainty and enhance the credibility of the LCA outcome. For that purpose, Monte Carlo simulations, in combination with an artificial neural network (ANN) model, have been conducted for the input parameters of raw material, produced electricity, NOX emissions, amount of ash and transportation of fertilizer. The given analysis has provided the distributions and confidence intervals of the selected impact categories and, in turn, more credible conclusions are drawn on the respective LCIA (Life Cycle Impact Assessment) profile of the NMT system. Last but not least, this study also yields essential insights into the methodological framework that can be adopted in the environmental impact assessment of other complex engineering systems subject to a high level of input data uncertainty.
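A hedged sketch of the stochastic LCI step: Monte Carlo propagation of input uncertainty through a linear impact model, reporting a 95% interval for the score. All distributions and characterization factors below are hypothetical, not Ecoinvent values.

```python
# Sketch of Monte Carlo uncertainty propagation for an LCI; made-up numbers.
import numpy as np

rng = np.random.default_rng(7)
n = 10_000
# Uncertain inventory inputs per functional unit (hypothetical lognormals)
raw_material = rng.lognormal(mean=np.log(1.2), sigma=0.10, size=n)   # kg
electricity = rng.lognormal(mean=np.log(0.5), sigma=0.15, size=n)    # kWh
nox = rng.lognormal(mean=np.log(0.02), sigma=0.20, size=n)           # kg

# Hypothetical characterization factors (kg CO2-eq per unit of input)
impact = 2.1 * raw_material + 0.9 * electricity + 265 * nox

lo, hi = np.percentile(impact, [2.5, 97.5])
print(f"GWP: median={np.median(impact):.2f}, "
      f"95% interval=[{lo:.2f}, {hi:.2f}] kg CO2-eq")
```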
35
10008659
Comparison of Diagnostic Performance of Soluble Transferrin Receptor and Soluble Transferrin Receptor-Ferritin Index Tests in the Diagnosis of Iron Deficiency Anemia
Abstract:

In this research article, a comprehensive analysis is performed to compare the diagnostic performance of the soluble transferrin receptor (sTfR) and sTfR/log ferritin index tests in the differential diagnosis of iron deficiency anemia (IDA) and anemia of chronic disease (ACD). The analysis is performed for both sTfR and the sTfR/log ferritin index using a set of 11 studies. The overall odds ratios for sTfR and the sTfR/log ferritin index were 36.79 and 119.32, respectively, at the 95% confidence interval. The relative sensitivity, specificity, positive likelihood ratio (LR) and negative LR values for sTfR in relation to the sTfR/log ferritin index were 81% vs. 85%, 84% vs. 93%, 6.31 vs. 13.95 and 0.18 vs. 0.14, respectively. The summary receiver operating characteristic (SROC) curves are also plotted for both sTfR and the sTfR/log ferritin index. The area under the SROC curves for sTfR and the sTfR/log ferritin index was found to be 0.9296 and 0.9825, respectively. Although both tests are useful, the sTfR/log ferritin index seems to be more effective when compared with sTfR.
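For reference, the sketch below computes the diagnostic indices reported above from a single hypothetical 2x2 table (not the pooled data of the 11 studies).

```python
# Diagnostic indices from one hypothetical 2x2 table
# (tp/fn: IDA cases, fp/tn: ACD cases).
def diagnostic_indices(tp, fn, fp, tn):
    sens = tp / (tp + fn)            # sensitivity
    spec = tn / (tn + fp)            # specificity
    lr_pos = sens / (1 - spec)       # positive likelihood ratio
    lr_neg = (1 - sens) / spec       # negative likelihood ratio
    dor = lr_pos / lr_neg            # diagnostic odds ratio
    return sens, spec, lr_pos, lr_neg, dor

sens, spec, lrp, lrn, dor = diagnostic_indices(tp=85, fn=15, fp=7, tn=93)
print(f"sens={sens:.2f} spec={spec:.2f} LR+={lrp:.2f} LR-={lrn:.2f} DOR={dor:.1f}")
```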

34
10007661
Comparison of Statins Dose Intensity on HbA1c Control in Outpatients with Type 2 Diabetes: A Prospective Cohort Study
Abstract:

The effect of statins dose intensity (SDI) on glycemic control in patients with existing diabetes is unclear, and many contradictory findings have been reported in the literature, which limits the possibility of drawing conclusions. This project was designed to compare the effect of SDI on glycated hemoglobin (HbA1c%) control in outpatients with Type 2 diabetes in the endocrine clinic at Hospital Pulau Pinang, Malaysia, between July 2015 and August 2016. A prospective cohort study was conducted, in which records of 345 patients with Type 2 diabetes (moderate-SDI group, 289 patients; high-SDI group, 56 patients) were reviewed to identify demographics and laboratory tests. The target of glycemic control (HbA1c < 7% for patients < 65 years, and < 8% for patients ≥ 65 years) was estimated, and the results were presented as descriptive statistics. Of the 289 patients in the moderate-SDI cohort, with a mean age of 57.3 ± 12.4 years, only 86 (29.8%) cases were shown to have controlled glycemia, while there were 203 (70.2%) cases with uncontrolled glycemia, with a confidence interval (CI) of 95% (6.2-10.8). On the other hand, the high-SDI group of 56 patients with Type 2 diabetes, with a mean age of 57.7 ± 12.4 years, comprised 11 (19.6%) patients with controlled glycemia, while 45 (80.4%) of them had uncontrolled glycemia, CI: 95% (7.1-11.9). The study has demonstrated that the relative risk (RR) of uncontrolled glycemia in patients with Type 2 diabetes who used high-SDI is 1.15, and the excess relative risk (ERR) is 15%. The absolute risk (AR) is 10.2%, and the number needed to harm (NNH) is 10. Outpatients with Type 2 diabetes who use high-SDI statins have a higher risk of uncontrolled glycemia than outpatients who were treated with moderate-SDI statins.
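The risk arithmetic quoted above can be reproduced directly from the reported proportions, as in this short sketch.

```python
# Reproducing the reported risk arithmetic from the rounded proportions
# quoted above (80.4% vs. 70.2% uncontrolled glycemia).
risk_high = 0.804  # high-SDI group
risk_mod = 0.702   # moderate-SDI group

rr = risk_high / risk_mod          # relative risk
err = (rr - 1) * 100               # excess relative risk, %
ar = (risk_high - risk_mod) * 100  # absolute risk difference, %
nnh = 1 / (risk_high - risk_mod)   # number needed to harm
print(f"RR = {rr:.2f}, ERR = {err:.0f}%, AR = {ar:.1f}%, NNH = {nnh:.0f}")
# -> RR = 1.15, ERR = 15%, AR = 10.2%, NNH = 10
```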

33
10007761
Integrating Geographic Information into Diabetes Disease Management
Abstract:

Background: Traditional chronic disease management has paid little attention to the effects of geographic factors on compliance with treatment regimes, which has resulted in geographic inequality in the outcomes of chronic disease management. This study aims to examine the geographic distribution and clustering of quality indicators of diabetes care. Method: We first extracted the address, demographic information and quality-of-care indicators (number of visits, complications, prescription and laboratory records) of patients with diabetes for 2014 from the medical information system of a medical center in Tainan City, Taiwan, and the patients' addresses were transformed into district- and village-level data. We then compared the differences in geographic distribution and clustering of quality-of-care indicators between districts and villages. In addition to the descriptive results, rate ratios and 95% confidence intervals (CI) were estimated for indices of care in order to compare the quality of diabetes care among different areas. Results: A total of 23,588 patients with diabetes were extracted from the hospital data system, of whom 12,716 patients' information and medical records were included in the following analysis. More than half of the subjects in this study were male and between 60-79 years old. Furthermore, the quality of diabetes care did indeed vary by geographic level. At the smaller level, clustered areas could be pointed out more specifically. Fuguo Village (of Yongkang District) and Zhiyi Village (of Sinhua District) were found to be "hotspots" for nephropathy and cerebrovascular disease, while Wangliau Village and Erwang Village (of Yongkang District) were "coldspots" with the lowest proportion of ≥80% compliance with blood lipid examination. On the other hand, Yuping Village (in Anping District) was the area with the lowest proportion of ≥80% compliance with all laboratory examinations. Conclusion: In addition to examining the geographic distribution, calculating rate ratios and their 95% CIs is a useful and consistent method to test the association. This information is useful for health planners, diabetes case managers and other affiliated practitioners to direct care resources to the areas most in need.
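A sketch of the rate-ratio comparison described above, using a log-normal approximation for the 95% CI of a ratio of two rates; the counts are hypothetical, not the study's village data.

```python
# Rate ratio with a log-normal 95% CI; event counts are made up.
import math

def rate_ratio_ci(events_a, person_a, events_b, person_b, z=1.96):
    rr = (events_a / person_a) / (events_b / person_b)
    se_log = math.sqrt(1 / events_a + 1 / events_b)  # SE of log(rate ratio)
    lo, hi = rr * math.exp(-z * se_log), rr * math.exp(z * se_log)
    return rr, lo, hi

# e.g. nephropathy events per person-years in two hypothetical villages
rr, lo, hi = rate_ratio_ci(events_a=24, person_a=1000, events_b=12, person_b=1100)
print(f"rate ratio = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```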

32
10005164
The Association between C-Reactive Protein and Hypertension of Different United States Participants Categorized by Ethnicity: Applying the National Health and Nutrition Examination Survey from 1999-2010
Abstract:
Objectives: The main objective of this study was to examine the association between elevated levels of C-reactive protein (CRP) and the incidence of hypertension before and after adjustment for age, BMI, gender, SES, smoking, diabetes, LDL cholesterol and HDL cholesterol, and to determine whether the association differs by race. Method: Cross-sectional data for participants aged 17 to 74 years, included in the National Health and Nutrition Examination Survey (NHANES) from 1999 to 2010, were analyzed. The CRP level was classified into three categories (> 3 mg/L, between 1 mg/L and 3 mg/L, and < 1 mg/L). Blood pressure categorization was done using the JNC 7 indicator. Hypertension was defined as a systolic blood pressure (SBP) of 140 mmHg or more, a diastolic blood pressure (DBP) of 90 mmHg or more, or a self-reported prior diagnosis by a physician. Pre-hypertension was defined as 139 ≥ SBP > 120 or 89 ≥ DBP > 80. A multinomial regression model was used to measure the association between CRP level and hypertension. Results: In univariable models, CRP concentrations > 3 mg/L were associated with a 73% greater risk of incident hypertension compared with CRP concentrations < 1 mg/L (hypertension: odds ratio [OR] = 1.73; 95% confidence interval [CI], 1.50-1.99). Ethnic comparisons showed that Mexican Americans had the highest risk of incident hypertension (OR = 2.39; 95% CI, 2.21-2.58). This risk was statistically insignificant after controlling for other variables (hypertension: OR = 0.75; 95% CI, 0.52-1.08), or when categorized by race [Mexican American: OR = 1.58; 95% CI, 0.58-4.26; Other Hispanic: OR = 0.87; 95% CI, 0.19-4.42; Non-Hispanic white: OR = 0.90; 95% CI, 0.50-1.59; Non-Hispanic black: OR = 0.44; 95% CI, 0.22-0.87]. The same results were found for pre-hypertension, and the Non-Hispanic black segment showed the highest significant risk for pre-hypertension (OR = 1.60; 95% CI, 1.26-2.03). When CRP concentrations were between 1.0 and 3.0 mg/L in unadjusted models, pre-hypertension was associated with a higher likelihood of elevated CRP (OR = 1.37; 95% CI, 1.15-1.62). The same relationship was maintained in Non-Hispanic whites, Non-Hispanic blacks, and other races (Non-Hispanic white: OR = 1.24; 95% CI, 1.03-1.48; Non-Hispanic black: OR = 1.60; 95% CI, 1.27-2.03; other race: OR = 2.50; 95% CI, 1.32-4.74), while the association was insignificant for Mexican Americans and other Hispanics. In the adjusted model, the relationship between CRP and pre-hypertension was no longer present. In contrast, hypertension was not independently associated with elevated CRP, and the results were the same after grouping by race or adjusting for the possible confounder variables. The same results were obtained when SBP or DBP was treated as a continuous measure. Conclusions: This study confirmed the existence of an association between hypertension, pre-hypertension and elevated levels of CRP; however, this association was no longer present after adjusting for other variables. Ethnic group differences were statistically significant in the univariable models, but disappeared after controlling for other variables.
31
10004123
Statistical Analysis and Optimization of a Process for CO2 Capture
Abstract:

CO2 capture and storage technologies play a significant role in the control of climate change through the reduction of carbon dioxide emissions into the atmosphere. The present study evaluates and optimizes CO2 capture through a process in which carbon dioxide is passed into pH-adjusted high-salinity water and reacted with sodium chloride to form a precipitate of sodium bicarbonate. This process is based on a modified Solvay process with higher CO2 capture efficiency, higher sodium removal, and a higher pH level, without the use of ammonia. The process was tested in a bubble-column semi-batch reactor and was optimized using response surface methodology (RSM). CO2 capture efficiency and sodium removal were optimized in terms of the major operating parameters, based on four levels and four variables in a Central Composite Design (CCD). The operating parameters were gas flow rate (0.5-1.5 L/min), reactor temperature (10-50 °C), buffer concentration (0.2-2.6%) and water salinity (25-197 g NaCl/L). The experimental data were fitted to a second-order polynomial using multiple regression and analyzed using analysis of variance (ANOVA). The optimum values of the selected variables were obtained using a response optimizer. The optimum conditions were tested experimentally using desalination reject brine with salinity ranging from 65,000 to 75,000 mg/L. The CO2 capture efficiency in 180 min was 99% and the maximum sodium removal was 35%. The experimental and predicted values were within the 95% confidence interval, which demonstrates that the developed model can successfully predict the capture efficiency and sodium removal using the modified Solvay method.
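A minimal sketch of the RSM fitting step on synthetic data (the responses below are made up, not the study's measurements): a second-order polynomial fitted by ordinary least squares, with R^2 as a summary.

```python
# Sketch of a second-order response-surface fit; data are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n = 30
flow = rng.uniform(0.5, 1.5, n)   # gas flow rate, L/min
temp = rng.uniform(10, 50, n)     # reactor temperature, deg C
# synthetic response: capture efficiency with curvature plus noise
y = 90 + 5 * flow - 0.05 * (temp - 20) ** 2 + rng.normal(0, 1, n)

# design matrix: intercept, linear, interaction and quadratic terms
X = np.column_stack([np.ones(n), flow, temp, flow * temp, flow**2, temp**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef
r2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print("coefficients:", np.round(coef, 4))
print(f"R^2 = {r2:.3f}")
```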

30
10004144
Modelling Phytoremediation Rates of Aquatic Macrophytes in Aquaculture Effluent
Abstract:

Pollutants from aquacultural practices constitute environmental problems, and phytoremediation could offer a cheaper, environmentally sustainable alternative, since equipment using advanced treatment for fish tank effluent is expensive to import, install, operate and maintain, especially in developing countries. The main objective of this research was, therefore, to develop a mathematical model for phytoremediation by aquatic plants in aquaculture wastewater. Other objectives were to evaluate the effect of retention times on phytoremediation rates using the model, and to measure the nutrient level of the aquaculture effluent and the phytoremediation rates of three aquatic macrophytes, namely water hyacinth (Eichhornia crassipes), water lettuce (Pistia stratiotes) and morning glory (Ipomoea asarifolia). A completely randomized experimental design was used in the study. Approximately 100 g of each macrophyte were introduced into the hydroponic units and phytoremediation indices were monitored at 8 different intervals from the first to the 28th day. The water quality parameters measured were pH and electrical conductivity (EC). Others were the concentrations of ammonium-nitrogen (NH4+-N), nitrite-nitrogen (NO2--N), nitrate-nitrogen (NO3--N) and phosphate-phosphorus (PO43--P), and the biomass value. The biomass produced by water hyacinth was 438.2 g, 600.7 g, 688.2 g and 725.7 g at four 7-day intervals. The corresponding values for water lettuce were 361.2 g, 498.7 g, 561.2 g and 623.7 g, and for morning glory were 417.0 g, 567.0 g, 642.0 g and 679.5 g. The coefficient of determination was greater than 80% for EC, TDS, NO2--N and NO3--N, and 70% for NH4+-N, using any of the macrophytes, and the predicted values were within the 95% confidence interval of the measured values. Therefore, the model is valuable in the design and operation of phytoremediation systems for aquaculture effluent.

29
10001515
Relevance of the Variation in the Angulation of Palatal Throat Form to the Orientation of the Occlusal Plane: A Cephalometric Study
Abstract:
The posterior reference for the ala-tragal line is a cause of confusion, with different authors suggesting different locations: the superior, middle or inferior part of the tragus. This study was conducted on 200 subjects to evaluate whether any correlation exists between the variation in the angulation of the palatal throat form and the relative parallelism of the occlusal plane to the ala-tragal line at different tragal levels. A custom-made occlusal plane analyzer was used to check the parallelism between the ala-tragal line and the occlusal plane. A lateral cephalogram was shot for each subject to measure the angulation of the palatal throat form. Fisher's exact test was used to evaluate the correlation between the angulation of the palatal throat form and the relative parallelism of the occlusal plane to the ala-tragal line. In addition, a classification of the palatal throat form was formulated, based on confidence intervals. From the results of the study, the inferior, middle and superior parts of the tragus were seen as the reference points in 49.5%, 32% and 18.5% of the subjects, respectively. Class I palatal throat form (41°-50°), Class II palatal throat form (below 41°) and Class III palatal throat form (above 50°) were seen in 42%, 43% and 15% of the subjects, respectively. It was also concluded that there is no significant correlation between the variation in the angulation of the palatal throat form and the relative parallelism of the occlusal plane to the ala-tragal line.
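A small sketch of the association test named above: Fisher's exact test on a hypothetical 2x2 table of throat-form class against tragal reference level.

```python
# Fisher's exact test on a made-up 2x2 table; not the study's counts.
from scipy.stats import fisher_exact

#            reference at: inferior tragus | middle tragus
table = [[52, 32],   # Class I palatal throat form
         [47, 38]]   # Class II palatal throat form
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.3f}")
```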
28
10001573
Forecast of Small Wind Turbine Sales with Replacement Purchases, with or without Account of Price Changes
Abstract:
The purpose of the paper is to estimate the US small wind turbine market potential and forecast small wind turbine sales in the US. The forecasting method is based on the application of the Bass model and the generalized Bass model of innovation diffusion under replacement purchases. In this work, an exponential distribution is used for modeling replacement purchases; the single parameter of this distribution is determined by the average lifetime of small wind turbines. The identification of the model parameters is based on nonlinear regression analysis of the annual sales statistics published by the American Wind Energy Association (AWEA) from 2001 to 2012. The estimate of the US average market potential of small wind turbines (for adoption purchases) without account of price changes is 57,080 (confidence interval from 49,294 to 64,866 at P = 0.95) under an average turbine lifetime of 15 years, and 62,402 (confidence interval from 54,154 to 70,648 at P = 0.95) under an average lifetime of 20 years. In the first case the explained variance is 90.7%, while in the second it is 91.8%. The effect of wind turbine price changes on sales was estimated using the generalized Bass model, which required a price forecast; for this, a polynomial regression function based on Berkeley Lab statistics was used. The estimate of the US average market potential of small wind turbines (for adoption purchases) in that case is 42,542 (confidence interval from 32,863 to 52,221 at P = 0.95) under an average lifetime of 15 years, and 47,426 (confidence interval from 36,092 to 58,760 at P = 0.95) under an average lifetime of 20 years. In both cases the explained variance is 95.3%.
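A sketch of the parameter-identification step: nonlinear regression of the basic Bass model on a synthetic annual sales series (not the AWEA data), with replacement purchases omitted for brevity; the starting values and all figures are assumptions.

```python
# Bass-model fit by nonlinear regression; synthetic sales, no replacements.
import numpy as np
from scipy.optimize import curve_fit

def bass_adoptions(t, p, q, m):
    """Annual Bass-model adoptions at integer time t (difference of F(t))."""
    f = (1 - np.exp(-(p + q) * t)) / (1 + (q / p) * np.exp(-(p + q) * t))
    f_prev = (1 - np.exp(-(p + q) * (t - 1))) / (
        1 + (q / p) * np.exp(-(p + q) * (t - 1)))
    return m * (f - f_prev)

t = np.arange(1, 13)  # 12 annual observations, e.g. 2001-2012
true = bass_adoptions(t, 0.01, 0.4, 57000)
sales = true * (1 + np.random.default_rng(5).normal(0, 0.05, t.size))

(p, q, m), cov = curve_fit(bass_adoptions, t, sales, p0=(0.01, 0.3, 50000))
se_m = np.sqrt(cov[2, 2])
print(f"p={p:.4f} q={q:.3f} m={m:.0f} "
      f"(95% CI {m - 1.96 * se_m:.0f}..{m + 1.96 * se_m:.0f})")
```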
27
10003643
Approximate Confidence Interval for Effect Size Based on Bootstrap Resampling Method
Abstract:
This paper presents confidence intervals for the effect size based on the bootstrap resampling method. A meta-analytic confidence interval for the effect size that is easy to compute is proposed. A Monte Carlo simulation study was conducted to compare the performance of the proposed confidence intervals with that of existing confidence intervals. The best confidence interval method will have a coverage probability close to 0.95. Simulation results show that our proposed confidence intervals perform well in terms of coverage probability and expected length.
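A minimal sketch of a bootstrap percentile interval for an effect size; Cohen's d is used here as a generic stand-in, and the data are synthetic.

```python
# Bootstrap percentile CI for Cohen's d on synthetic two-group data.
import numpy as np

rng = np.random.default_rng(11)
group1 = rng.normal(0.5, 1.0, 40)
group2 = rng.normal(0.0, 1.0, 40)

def cohens_d(a, b):
    """Standardized mean difference with a pooled SD."""
    pooled = np.sqrt(((a.size - 1) * a.var(ddof=1) +
                      (b.size - 1) * b.var(ddof=1)) / (a.size + b.size - 2))
    return (a.mean() - b.mean()) / pooled

boot = [cohens_d(rng.choice(group1, group1.size),
                 rng.choice(group2, group2.size)) for _ in range(5000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"d = {cohens_d(group1, group2):.2f}, 95% bootstrap CI [{lo:.2f}, {hi:.2f}]")
```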
26
9998844
An Enhanced Floor Estimation Algorithm for Indoor Wireless Localization Systems Using Confidence Interval Approach
Abstract:

Indoor wireless localization systems have played an important role in enhancing context-aware services. Determining the position of mobile objects in complex indoor environments, such as multi-floor buildings, is a very challenging problem. This paper presents an effective floor estimation algorithm, which can accurately determine the floor on which mobile objects are located. The proposed algorithm is based on the confidence interval of the summation of online Received Signal Strength (RSS) obtained from IEEE 802.15.4 Wireless Sensor Networks (WSN). We compare the performance of the proposed algorithm with those of other floor estimation algorithms in the literature by conducting a real implementation of a WSN in our facility. The experimental results and analysis show that the proposed floor estimation algorithm outperformed the other algorithms, providing the highest floor accuracy, up to 100%, with a 95-percent confidence interval.
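A hedged sketch of the idea described above, with synthetic RSS values: build an interval for the summed RSS of each floor from training data, then assign an online reading to the floor whose interval contains it. The interval construction and the fallback rule are assumptions, not the paper's exact algorithm.

```python
# Floor estimation from per-floor intervals of summed RSS; synthetic data.
import numpy as np

rng = np.random.default_rng(2)
# training: summed RSS (dBm) over the anchor nodes of each floor
training = {1: rng.normal(-280, 4, 50),
            2: rng.normal(-310, 4, 50),
            3: rng.normal(-340, 4, 50)}

intervals = {}
for floor, samples in training.items():
    m, s = samples.mean(), samples.std(ddof=1)
    half = 1.96 * s  # ~95% interval for a single new observation
    intervals[floor] = (m - half, m + half)

def estimate_floor(rss_sum):
    hits = [f for f, (lo, hi) in intervals.items() if lo <= rss_sum <= hi]
    if len(hits) == 1:
        return hits[0]
    # fallback: nearest interval midpoint if no hit or an ambiguous hit
    return min(intervals, key=lambda f: abs(rss_sum - sum(intervals[f]) / 2))

print(estimate_floor(-308.5))  # expected: floor 2
```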

25
9997937
Shoreline Change Estimation from Survey Image Coordinates and Neural Network Approximation
Abstract:

Shoreline erosion problems caused by global warming and sea level rise may result in the loss of land areas, so shorelines should be examined regularly to reduce possible negative impacts. Initially in this study, three sets of survey images, obtained in the years 1990, 2001, and 2010, respectively, are digitized using graphical software to establish the spatial coordinates of six major beaches around the island of Taiwan. Then, by overlaying the known multi-period images, the change of the shoreline can be observed from the distribution of coordinates. In addition, neural network approximation is used to develop a model for predicting shoreline variation in the years 2015 and 2020. The comparison results show that there is no significant change in total sandy area for any of the beaches over the three periods. However, the prediction results show that two beaches may exhibit an increase in total sandy area at a statistical 95% confidence interval. The proposed method adopted in this study may be applicable to other shorelines of interest around the world.

24
9997740
Comparison of Prognostic Models in Different Scenarios of Shoreline Position on Ponta Negra Beach in Northeastern Brazil
Abstract:

Prognostic studies of the shoreline are of utmost importance for Ponta Negra Beach, located in Natal, Northeastern Brazil, where the infrastructure recently built along the shoreline is severely affected by flooding and erosion. This study compares shoreline predictions using three linear regression methods (LMS, LRR and WLR) and tries to discern the best method for different shoreline position scenarios. The methods have shown erosion on the beach in each of the scenarios tested, even under less intense dynamic conditions. The WLA_A, with a confidence interval of 95%, was the best-adjusted model and calculated a retreat of -1.25 m/yr to -2.0 m/yr in hot-spot areas. The change of the shoreline on Ponta Negra Beach can be modeled as a negative exponential curve. Analysis of these methods has shown a correlation with the morphodynamic stage of the beach.

23
9997595
Reliability Analysis of k-out-of-n: G System Using Triangular Intuitionistic Fuzzy Numbers
Abstract:

In the present paper, we analyze the vague reliability of a k-out-of-n: G system (in particular, series and parallel systems) with independent and non-identically distributed components, where the reliability of the components is unknown. The reliability of each component has been estimated using a statistical confidence interval approach. These statistical confidence intervals were then converted into triangular intuitionistic fuzzy numbers. Based on these triangular intuitionistic fuzzy numbers, the reliability of the k-out-of-n: G system has been calculated. Further, in order to implement the proposed methodology and to analyze the results for the k-out-of-n: G system, a numerical example has been provided.
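An illustrative sketch of one assumed construction (not necessarily the paper's exact formulation): each component reliability is represented as a triangular number (lower CI bound, point estimate, upper CI bound) and propagated element-wise through series and parallel structures.

```python
# Triangular (lo, mode, hi) reliability propagation; assumed construction.
def series(components):
    lo = m = hi = 1.0
    for l, p, u in components:
        lo, m, hi = lo * l, m * p, hi * u  # series reliability: product
    return lo, m, hi

def parallel(components):
    q_lo = q_m = q_hi = 1.0  # products of unreliabilities
    for l, p, u in components:
        q_lo, q_m, q_hi = q_lo * (1 - l), q_m * (1 - p), q_hi * (1 - u)
    # a lower component bound gives a lower system bound: 1-q_lo <= 1-q_hi
    return 1 - q_lo, 1 - q_m, 1 - q_hi

# each tuple: (lower CI bound, point estimate, upper CI bound); made up
comps = [(0.90, 0.95, 0.98), (0.85, 0.92, 0.96), (0.88, 0.93, 0.97)]
print("series:  ", tuple(round(r, 4) for r in series(comps)))
print("parallel:", tuple(round(r, 4) for r in parallel(comps)))
```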

22
17139
Profit Optimization for Solar Plant Electricity Production
Abstract:

In this paper, a stochastic scenario-based model predictive control applied to molten salt storage systems in a concentrated solar tower power plant is presented. The main goal of this study is to build a tool to analyze current and expected future resources for evaluating the weekly power to be advertised on the electricity secondary market. This tool will allow the plant operator to maximize profits while hedging the impact on the system of stochastic variables such as resource or sunlight shortage.

Solving the problem first requires a mixed logical dynamical modeling of the plant. The two stochastic variables, respectively the incoming solar energy and the electricity demand from the secondary market, are modeled by least squares regression. Robustness is achieved by drawing a certain number of random variable realizations and applying the most restrictive one to the system. This scenario-approach control technique provides the plant operator with a confidence interval containing a given percentage of possible stochastic variable realizations, in such a way that robust control is always achieved within its bounds. The results obtained from many trajectory simulations show the existence of a 'reliable' interval, which experimentally confirms the algorithm's robustness.

21
16712
Confidence Interval for the Inverse of a Normal Mean with a Known Coefficient of Variation
Abstract:

In this paper, we propose two new confidence intervals for the inverse of a normal mean with a known coefficient of variation. One of the new confidence intervals is constructed based on the pivotal statistic Z, where Z has a standard normal distribution, and the other is constructed based on the generalized confidence interval presented by Weerahandi. We examine the performance of these confidence intervals in terms of coverage probabilities and average lengths via Monte Carlo simulation.
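A sketch of the kind of Monte Carlo evaluation described above: coverage and average length of a Z-based interval for 1/μ when the coefficient of variation τ = σ/μ is known. The delta-method interval shown is an assumed construction for illustration, not necessarily the proposed one.

```python
# Monte Carlo coverage study for an assumed delta-method interval for 1/mu.
import numpy as np

rng = np.random.default_rng(13)
mu, tau, n, reps = 5.0, 0.2, 30, 20_000
sigma = tau * mu

cover, lengths = 0, []
for _ in range(reps):
    x = rng.normal(mu, sigma, n)
    xbar = x.mean()
    est = 1 / xbar
    # delta method: Var(1/xbar) ~ sigma^2 / (n * xbar^4), with sigma = tau*mu
    # approximated by tau*xbar, giving se = tau / (sqrt(n) * |xbar|)
    se = tau / (np.sqrt(n) * abs(xbar))
    lo, hi = est - 1.96 * se, est + 1.96 * se
    cover += lo <= 1 / mu <= hi
    lengths.append(hi - lo)

print(f"coverage = {cover / reps:.3f}, average length = {np.mean(lengths):.4f}")
```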

20
16718
Confidence Intervals for the Coefficients of Variation with Bounded Parameters
Abstract:

In many practical applications in various areas, such as engineering, science and social science, it is known that there exist bounds on the values of unknown parameters; for example, values of some measurements for controlling machines in an industrial process, the weight or height of subjects, the blood pressure of patients and the retirement ages of public servants. When interval estimation is considered in a situation where the parameter to be estimated is bounded, it has been argued that the classical Neyman procedure for setting confidence intervals is unsatisfactory. This is due to the fact that the information regarding the restriction is simply ignored. It is, therefore, of significant interest to construct confidence intervals for the parameters that include the additional information that parameter values are bounded, to enhance the accuracy of the interval estimation. In this paper, we therefore propose a new confidence interval for the coefficient of variation in the case where the population mean and standard deviation are bounded. The proposed interval is evaluated in terms of coverage probability and expected length via Monte Carlo simulation.

19
16732
On Simple Confidence Intervals for the Normal Mean with Known Coefficient of Variation
Abstract:

In this paper, we propose a new confidence interval for the normal population mean with a known coefficient of variation. In practice, this situation occurs in the environmental and agricultural sciences, where we know that the standard deviation is proportional to the mean, so that the coefficient of variation is known. We propose a new confidence interval based on the recent work of Khan [3], and this new confidence interval is compared with our previous work, see, e.g., Niwitpong [5]. We derive analytic expressions for the coverage probability and the expected length of each confidence interval. A numerical method is used to assess the performance of these intervals based on their expected lengths.

18
17024
Maximum Likelihood Estimation of Burr Type V Distribution under Left Censored Samples
Abstract:

The paper deals with the maximum likelihood estimation of the parameters of the Burr type V distribution based on left censored samples. The maximum likelihood estimators (MLE) of the parameters have been derived and the Fisher information matrix for the parameters of the said distribution has been obtained explicitly. The confidence intervals for the parameters have also been discussed. A simulation study has been conducted to investigate the performance of the point and interval estimates.

17
10153
Role of Oxidative DNA Damage in Pathogenesis of Diabetic Neuropathy
Abstract:
Oxidative stress is considered to be the cause of the onset and progression of type 2 diabetes mellitus (T2DM) and its complications, including neuropathy. It is a deleterious process that can be an important mediator of damage to cell structures: proteins, lipids and DNA. Data suggest that in patients with diabetes and diabetic neuropathy DNA repair is impaired, which prevents effective removal of lesions. Objective: The aim of our study was to evaluate the association of the hOGG1 (326 Ser/Cys) and XRCC1 (194 Arg/Trp, 399 Arg/Gln) gene polymorphisms, whose proteins are involved in the BER pathway, with DNA repair efficiency in patients with type 2 diabetes and diabetic neuropathy compared to healthy subjects. Genotypes were determined by PCR-RFLP analysis in 385 subjects, including 117 with type 2 diabetes, 56 with diabetic neuropathy and 212 with normal glucose metabolism. The polymorphisms studied include codon 326 of hOGG1 and codons 194 and 399 of XRCC1 in the base excision repair (BER) genes. The comet assay was carried out using peripheral blood lymphocytes from the patients and controls. This test enabled the evaluation of DNA damage in cells exposed to hydrogen peroxide alone and in combination with endonuclease III (Nth). The results of the polymorphism analysis were statistically examined by calculating the odds ratios (OR) and their 95% confidence intervals (95% CI) using χ2-tests. Our data indicate that patients with diabetes mellitus type 2 (including those with neuropathy) had higher frequencies of the XRCC1 399Arg/Gln polymorphism in homozygotes (GG) (OR: 1.85 [95% CI: 1.07-3.22], P = 0.3) and also an increased frequency of the 399Gln (G) allele (OR: 1.38 [95% CI: 1.03-1.83], P = 0.3). No relation was found between the other polymorphisms and an increased risk of diabetes or diabetic neuropathy. In T2DM patients complicated by neuropathy, there was less efficient repair of oxidative DNA damage induced by hydrogen peroxide, both in the presence and absence of the Nth enzyme. The results of our study suggest that the XRCC1 399 Arg/Gln polymorphism is a significant risk factor for T2DM in the Polish population. The obtained data suggest that a decreased efficiency of DNA repair in cells from patients with diabetes and neuropathy may be associated with oxidative stress. Additionally, patients with neuropathy are characterized by even greater sensitivity to oxidative damage than patients with diabetes, which suggests the participation of free radicals in the pathogenesis of neuropathy.
16
13610
Confidence Intervals for the Normal Mean with Known Coefficient of Variation
Abstract:

In this paper, we propose two new confidence intervals for the normal population mean with a known coefficient of variation. This situation occurs normally in environmental and agricultural experiments, where scientists know the coefficient of variation of their experiments. We propose two new confidence intervals for this problem, one based on the recent work of Searls [5] and one based on a new method proposed in this paper for the first time. We derive analytic expressions for the coverage probability and the expected length of each confidence interval. Monte Carlo simulation is used to assess the performance of these intervals based on their expected lengths.

15
8600
Optimization of Process Parameters of Pressure Die Casting using Taguchi Methodology
Abstract:

The present work analyses different parameters of pressure die casting to minimize casting defects. Pressure die casting is usually applied for the casting of aluminium alloys. A good surface finish with the required tolerances and dimensional accuracy can be achieved by optimization of controllable process parameters such as solidification time, molten temperature, filling time, injection pressure and plunger velocity. Moreover, by selection of optimum process parameters, pressure die casting defects such as porosity, insufficient spread of molten material, flash, etc. are also minimized. Therefore, a pressure die cast component, a carburetor housing of aluminium alloy (Al2Si2O5), has been considered. The effects of the selected process parameters on casting defects and the subsequent setting of the parameters with their levels have been accomplished by Taguchi's parameter design approach. The experiments have been performed as per the combinations of levels of different process parameters suggested by an L18 orthogonal array. Analyses of variance have been performed for the mean and the signal-to-noise ratio to estimate the percent contribution of the different process parameters. A confidence interval has also been estimated at the 95% confidence level, and three confirmation experiments have been performed to validate the optimum levels of the different parameters. An overall 2.352% reduction in defects has been observed with the help of the suggested optimum process parameters.
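A small sketch of the Taguchi signal-to-noise calculation used in such analyses, in the smaller-the-better form appropriate when defects are to be minimized; the replicate defect counts below are hypothetical.

```python
# Taguchi S/N ratio, smaller-the-better form; made-up replicate data.
import numpy as np

def sn_smaller_the_better(y):
    """S/N = -10 * log10(mean of squared responses)."""
    y = np.asarray(y, dtype=float)
    return -10 * np.log10(np.mean(y**2))

# replicate defect counts for three rows of an L18-style plan (made up)
for run, defects in enumerate([[4, 5, 3], [2, 2, 3], [6, 5, 7]], start=1):
    print(f"run {run}: S/N = {sn_smaller_the_better(defects):.2f} dB")
```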

14
2302
Availability of Sports Facilities does not explain the Association between Economic Environment and Physical Inactivity in a Southern European city
Abstract:
This paper evaluates the association between the economic environment in the districts of Madrid (Spain) and physical inactivity, using income per capita as an indicator of economic environment. The analysis included 6,601 individuals aged 16 to 74 years. The measure of association estimated was the prevalence odds ratio for physical inactivity by income per capita. After adjusting for sex, age, and individual socioeconomic characteristics, people living in the districts with the lowest per capita income had an odds ratio for physical inactivity 1.58 times higher (95% confidence interval 1.35 to 1.85) than those living in the districts with the highest per capita income. Additional adjustment for the availability of sports facilities in each district did not decrease the magnitude of the association. These findings show that the widely held assumption that the availability of sports and recreational facilities explains the relation between economic environment and physical inactivity cannot be considered a universal observation.
13
11843
Bootstrap Confidence Intervals and Parameter Estimation for Zero Inflated Strict Arcsine Model
Abstract:

The zero-inflated strict arcsine model is a newly developed model which has been found appropriate for modeling overdispersed count data. In this study, the maximum likelihood estimation method is used to estimate the parameters of the zero-inflated strict arcsine model. Bootstrapping is then employed to compute confidence intervals for the estimated parameters.

12
8809
Confidence Intervals for Double Exponential Distribution: A Simulation Approach
Abstract:
The double exponential model (DEM), or Laplace distribution, is used in various disciplines. However, there are issues related to the construction of confidence intervals (CI) when using this distribution. In this paper, the properties of the DEM are considered with the intention of constructing CIs based on simulated data. An analysis of the pivotal equations for the model, in comparison with the pivotal equations for the normal distribution, is performed, and results obtained from simulated data are presented.
11
160
A New Method in Short-Term Heart Rate Variability — Five-Class Density Histogram
Abstract:

A five-class density histogram with an index named cumulative density was proposed to analyze short-term HRV. 150 subjects participated in the test, falling into three groups of equal size: the healthy young group (Young), the healthy old group (Old), and a group of patients with congestive heart failure (CHF). Results of multiple comparisons showed significant differences in cumulative density among the three groups, with values of 0.0238 for Young, 0.0406 for Old and 0.0732 for CHF (p < 0.001). After 7 days and 14 days, 46 subjects from the Young and Old groups were retested twice following the same test protocol. Results showed good-to-excellent intraclass correlations (ICC = 0.783, 95% confidence interval 0.676-0.864). Bland-Altman plots were used to re-examine the test-retest reliability. In conclusion, the method proposed could be a valid and reliable method for short-term HRV assessment.
