A well-known hazardous metal and top contaminant in wastewater is hexavalent chromium. The two most commonly found forms of chromium are chromate (CrO₄²⁻) and dichromate (Cr₂O₇²⁻). Leather tanning, cooling-tower blow-down, plating, electroplating, rinse water, and anodizing baths are the main sources of Cr(VI) contamination. Cr(VI) is not only non-biodegradable in the environment but also carcinogenic to living organisms. It remains difficult to treat Cr-contaminated wastewater effectively, safely, economically, and in an environmentally friendly manner. As a result, many techniques have been used to treat Cr(VI)-polluted wastewater, including adsorption, chemical precipitation, coagulation, ion exchange, and filtration. Among these, adsorption is the most practical method for removing Cr(VI) from aqueous solutions and has gained widespread acceptance because the equipment and adsorbents are affordable and easy to use. Fe-based oxide and hydroxide adsorbents have been shown to have high adsorptive potential, lowering Cr(VI) content below the advised threshold; they are also relatively cheap, non-toxic, and widely used in industry. Nanoparticles of Fe-, Ti-, and Cu-based adsorbents have been found to have an even better capacity to remove Cr(VI), and mixed-element adsorbents (Fe-Mn, Fe-Ti, Fe-Cu, Fe-Zr, Fe-Cu-Y, Fe-Mg, etc.) have effectively removed Cr(VI) from contaminated water. Initial findings suggest that Cr(VI) removal from wastewater may be accomplished using magnesium ferrite nanomaterials as an efficient adsorbent.
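As background for how such adsorption capacities are typically quantified, the sketch below fits a Langmuir isotherm, q = q_max·K_L·C / (1 + K_L·C), to batch equilibrium data. The data values and fitted parameters are illustrative assumptions, not results from the studies above.

```python
# Minimal sketch: fitting a Langmuir isotherm to hypothetical Cr(VI)
# adsorption data. All numbers are illustrative, not from the studies above.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c_eq, q_max, k_l):
    """Langmuir isotherm: q = q_max * K_L * C / (1 + K_L * C)."""
    return q_max * k_l * c_eq / (1.0 + k_l * c_eq)

# Hypothetical equilibrium concentrations (mg/L) and uptakes (mg/g)
c_eq = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 160.0])
q_obs = np.array([8.1, 13.9, 21.5, 29.2, 34.8, 38.0])

(q_max, k_l), _ = curve_fit(langmuir, c_eq, q_obs, p0=[40.0, 0.05])
print(f"q_max = {q_max:.1f} mg/g, K_L = {k_l:.3f} L/mg")
```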
Analysis of free fall and of the acceleration of a mass on the Earth shows that using abstract entities such as absolute space or inertial space to explain mass dynamics leads to a violation of the principle of action and reaction. Many scientists, including Newton, Mach, and Einstein, recognized that inertial force has no reaction that originates on a mass. Einstein called the lack of a reaction to the inertial force a serious criticism of the space-time continuum concept. The hypothesis presented here is that inertial force develops in an interaction of two masses via a force field, and that the inertial force created by such a field has a reaction force. The predicted dynamic gravitational field is strong enough to be detected in the laboratory, and this article describes a laboratory experiment that can prove or disprove the hypothesis. The inertial force calculated using the equation for the dynamic gravitational field agrees with the behavior of inertial force observed in experiments on the Earth. The movement of the planets in our solar system calculated using that equation is the same as that calculated using Newton's method. The space properties calculated by the candidate equation explain the aberration of light and the results of light-propagation experiments. The dynamic gravitational field can also explain the discrepancy between the observed velocities of stars in galaxies and those predicted by Newton's theory of gravitation, without the need for the dark-matter hypothesis.
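For reference, the Newtonian baseline against which the abstract says the candidate equation was checked can be sketched as a simple two-body orbit integration. The constants, step size, and leapfrog scheme below are illustrative assumptions, not the paper's method; the candidate equation itself is not given in the abstract.

```python
# Minimal sketch of the Newtonian baseline: a two-body (Sun-Earth-like)
# orbit integrated with a leapfrog (kick-drift-kick) scheme. SI units;
# all values are approximate and for illustration only.
import numpy as np

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # solar mass, kg

def accel(r):
    """Newtonian acceleration a = -G*M*r/|r|^3 toward the Sun at the origin."""
    return -G * M_SUN * r / np.linalg.norm(r) ** 3

r = np.array([1.496e11, 0.0])   # initial position, m (~1 AU)
v = np.array([0.0, 2.978e4])    # initial velocity, m/s (~circular speed)
dt = 3600.0                     # one-hour time step

for _ in range(24 * 365):       # integrate roughly one year
    v += 0.5 * dt * accel(r)    # half kick
    r += dt * v                 # drift
    v += 0.5 * dt * accel(r)    # half kick

print(f"after one year: |r| = {np.linalg.norm(r):.3e} m")
```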
Credit card fraud remains a significant challenge, with financial losses and consumer protection at stake. This study addresses the need for practical, real-time fraud detection methodologies. Using a Kaggle credit card dataset, I tackle class imbalance with the Synthetic Minority Oversampling Technique (SMOTE) to enhance modeling efficiency. I compare several machine learning algorithms, including Logistic Regression, Linear Discriminant Analysis, K-Nearest Neighbors, Classification and Regression Tree, Naive Bayes, Support Vector Machine, Random Forest, XGBoost, and Light Gradient-Boosting Machine, to classify transactions as fraudulent or genuine. Rigorous evaluation metrics, such as AUC, PRAUC, F1, KS, Recall, and Precision, identify Random Forest as the best performer in detecting fraudulent activities. The Random Forest model identifies approximately 92% of transactions scoring 90 and above as fraudulent, equating to a detection rate of over 70% of all fraudulent transactions in the test dataset. Moreover, the model captures more than half of the fraud in each bin of the test dataset. SHAP values provide model explainability: the SHAP summary plot highlights the global importance of individual features, such as "V12" and "V14", while SHAP force plots offer local interpretability, revealing the impact of specific features on individual predictions. This study demonstrates the potential of machine learning, particularly the Random Forest model, for real-time credit card fraud detection, offering a promising approach to mitigating financial losses and protecting consumers.
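A minimal sketch of the pipeline described above, assuming the public Kaggle dataset's column layout (features V1-V28 plus Amount, label Class): SMOTE applied to the training split only, a Random Forest classifier, and SHAP values for global and local explanations. The file name and hyperparameters are assumptions.

```python
# Minimal sketch (assumed column names from the public Kaggle dataset):
# SMOTE on the training split only, Random Forest, SHAP explanations.
import pandas as pd
import shap
from imblearn.over_sampling import SMOTE
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("creditcard.csv")                  # assumed file name
X, y = df.drop(columns=["Class"]), df["Class"]
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)

# Oversample the minority (fraud) class in the training data only
X_res, y_res = SMOTE(random_state=42).fit_resample(X_tr, y_tr)

model = RandomForestClassifier(n_estimators=200, n_jobs=-1, random_state=42)
model.fit(X_res, y_res)
print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))

# SHAP: global importance (summary plot); force plots work the same way
explainer = shap.TreeExplainer(model)
sv = explainer.shap_values(X_te.iloc[:200])
fraud_sv = sv[1] if isinstance(sv, list) else sv[..., 1]  # fraud-class values
shap.summary_plot(fraud_sv, X_te.iloc[:200])
```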
Many domains, including communication, signal processing, and image processing, use the Fourier Transform as a mathematical tool for signal analysis. Although it can analyze signals with steady and transitory properties, it has limits. In this study, we propose Wavelet Packet Decomposition (WPD) as a technique to improve on the Fourier Transform and overcome these drawbacks. Specifically, we used Daubechies level 4 for the wavelet transformation, a choice motivated by several considerations: Daubechies wavelets are known for their compact support, orthogonality, and good time-frequency localization, and level 4 strikes a balance between preserving important transient information and avoiding excessive noise or oversmoothing in the transformed signal. We then compared the outcomes of the proposed approach with the conventional Fourier Transform on a non-stationary signal. The findings demonstrate that the proposed method offers a more accurate representation of non-stationary and transient signals in the frequency domain. Compared with the standard Fourier Transform, our method showed a 12% reduction in MSE and a 3% rise in PSNR; compared with the traditional wavelet packet decomposition method, it showed a 35% decrease in MSE and an 8% increase in PSNR on voice signals.
Wouladje Cabrel, Golden Tendekai Mumanikidzwa, Jianguo Shen, Yutong Yan
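A minimal sketch of the decomposition described in the preceding abstract, assuming "Daubechies level 4" denotes the db4 wavelet in a level-4 wavelet packet tree (built here with PyWavelets); the toy chirp signal is an illustrative stand-in for a non-stationary input.

```python
# Minimal sketch: level-4 wavelet packet decomposition with a Daubechies
# ('db4') wavelet using PyWavelets, on a toy non-stationary (chirp) signal.
# Signal parameters are illustrative assumptions.
import numpy as np
import pywt

fs = 1024                                   # sample rate, Hz
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * (20 + 100 * t) * t)  # frequency sweeps upward over time

wp = pywt.WaveletPacket(data=x, wavelet="db4", mode="symmetric", maxlevel=4)
leaves = wp.get_level(4, order="freq")      # 16 frequency-ordered subbands

# Energy per subband gives a coarse picture of the signal's spectrum
for node in leaves:
    print(node.path, float(np.sum(node.data ** 2)))
```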
Pricing strategies can have a huge impact on a company's success. This paper focuses on the advantages and disadvantages of using artificial intelligence in dynamic pricing strategies. A good understanding of the possible benefits and challenges will help companies understand the impact of their chosen pricing strategies. AI-driven dynamic pricing offers great opportunities to increase a firm's profits. Firms can benefit from personalized pricing based on individual behavior and characteristics, as well as from cost reductions achieved by increasing efficiency and replacing manual work with automation. However, AI-driven dynamic pricing can have a negative impact on customers' perceptions of trust, fairness, and transparency. Since price discrimination is involved, ethical issues such as privacy and equity may arise. Understanding both the businesses and the customers that shape a pricing strategy is essential, as neither can be considered without the other. This paper provides a comprehensive overview of the main advantages and disadvantages of AI-assisted dynamic pricing strategies, with the main objective of uncovering the most notable advantages and disadvantages of implementing them. Future research can extend the understanding of algorithmic pricing through case studies, from which new practical implications can be developed. It is important to investigate how issues related to customers' trust and feelings of unfairness can be mitigated, for example through price framing.
Although there are many measures of variability for qualitative variables, they are little used in social research, nor are they included in statistical software. The aim of this article is to present six easily computed measures of variation for qualitative variables and to facilitate their use by means of the R software. The measures considered are, on the one hand, Freeman's variation ratio, Moral's universal variation ratio, Kvalseth's standard deviation from the mode, and Wilcox's variation ratio, which are most affected by proximity to a constant random variable, where measures of variability for qualitative variables reach their minimum value of 0. On the other hand, the Gibbs-Poston index of qualitative variation and Shannon's relative entropy are included, which are more affected by proximity to a uniform distribution, where measures of variability for qualitative variables reach their maximum value of 1. Point and interval estimation are addressed; confidence intervals are obtained by bootstrap, using the percentile and bias-corrected and accelerated percentile methods. Two calculation situations are presented: with a single sample mode and with two or more modes. Among the six measures considered, the standard deviation from the mode is particularly recommended for use; among the three variation ratios, the universal variation ratio is recommended.
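Two of the six measures are simple enough to state inline: Freeman's variation ratio, v = 1 − f_mode/n, and Shannon's relative entropy, H/ln k for k categories. A minimal sketch with a percentile-bootstrap confidence interval follows; the article works in R, so this Python translation and its sample data are assumptions for illustration.

```python
# Minimal sketch of two of the six measures: Freeman's variation ratio
# v = 1 - f_mode/n and Shannon's relative entropy H / ln(k), plus a
# percentile-bootstrap confidence interval. Sample data are synthetic.
import numpy as np

def variation_ratio(x):
    """1 minus the relative frequency of the modal category."""
    _, counts = np.unique(x, return_counts=True)
    return 1.0 - counts.max() / counts.sum()

def relative_entropy(x, k):
    """Shannon entropy normalized by ln(k): 0 = constant, 1 = uniform."""
    _, counts = np.unique(x, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p)) / np.log(k)

rng = np.random.default_rng(1)
sample = rng.choice(["a", "b", "c", "d"], size=200, p=[0.55, 0.25, 0.15, 0.05])

# Percentile bootstrap (resampling with replacement) for the variation ratio
boot = [variation_ratio(rng.choice(sample, size=sample.size))
        for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"v = {variation_ratio(sample):.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
print(f"relative entropy = {relative_entropy(sample, k=4):.3f}")
```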
In clinical practice, dentists sometimes encounter phenomena that cannot be explained by modern western medical concepts; for example, a patient's medical symptoms improve when medicines or dentures are simply brought close to the body. Although it seems difficult to completely elucidate the mechanism through modern western medicine, it can be explained using quantum mechanics. The quantum, the smallest unit of matter, exhibits wave-particle duality. The fact that symptoms can improve simply by bringing dentures or medicines closer to the body suggests that the waves emitted by the dentures or medicines interfere with the pathological waves emitted by the pathological site; the pathological waves are thus deformed, leading to a change in symptoms. In this way, quantum theory can explain phenomena encountered in clinical practice that are difficult to elucidate in conventional medicine. The author has previously presented a case of difficulty in raising the upper limb in which the symptoms improved, without the dentures being placed in the mouth, by adjusting the dentures outside the mouth. Here, the author introduces a case in which a patient's knee pain improved by adjusting the dentures outside the mouth.
Walkability is an essential aspect of urban transportation systems. Properly designed walking paths can enhance transportation safety, encourage pedestrian activity, and improve community quality of life, which in turn can help achieve sustainable development goals in urban areas. This pilot study uses wearable technology data to present a new method for measuring pedestrian stress in urban environments, and presents the results as an interactive geographic information system map to support risk-informed decision-making. The approach involves analyzing data from wearable devices using heart rate variability metrics (RMSSD and slope analysis) to identify high-stress locations. This data-driven approach can help urban planners and safety experts identify and address pedestrian stressors, ultimately creating safer, more walkable cities. The study addresses a significant challenge in pedestrian safety by providing insights into the factors and locations that trigger stress in pedestrians. During the pilot study, high-stress pedestrian experiences were traced to issues such as pedestrian-scooter interaction on pedestrian paths, pedestrian behavior around high-foot-traffic areas, and poor visibility at pedestrian crossings due to inadequate lighting.
Ishita Dash, Rachael Anne Muscatello, Mark D. Abkowitz, Ella R. Mostoller, Mike Sewell
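RMSSD, the heart rate variability metric named in the preceding abstract, is the root mean square of successive differences between interbeat (RR) intervals. A minimal sketch on synthetic wearable data follows; the window length and stress threshold are illustrative assumptions, not the study's calibration.

```python
# Minimal sketch: RMSSD (root mean square of successive differences of
# RR intervals) over sliding windows, a common HRV-based stress proxy.
# Window length and the flagging threshold are illustrative assumptions.
import numpy as np

def rmssd(rr_ms):
    """RMSSD of a sequence of RR intervals given in milliseconds."""
    diffs = np.diff(rr_ms)
    return np.sqrt(np.mean(diffs ** 2))

rng = np.random.default_rng(0)
rr = 800 + rng.normal(0, 40, size=600)      # ~600 beats of synthetic RR data

window = 60                                 # beats per window (~1 minute)
scores = [rmssd(rr[i:i + window]) for i in range(0, rr.size - window, window)]

# Lower RMSSD generally indicates higher arousal; flag windows below an
# assumed threshold as candidate high-stress locations for the GIS map.
for i, s in enumerate(scores):
    flag = "<- candidate high-stress window" if s < 45 else ""
    print(f"window {i}: RMSSD = {s:.1f} ms {flag}")
```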
In light of its rapid growth and development, social media has become a focus of interest in many scientific fields, which seek to extract useful information, or knowledge, from it: for example, information about people's behaviors and interactions, used to analyze sentiment or to understand the behavior of users and groups, among many other applications. This extracted knowledge plays an important role in decision-making, in creating and improving marketing objectives and competitive advantage, in monitoring political and economic events, and in development across many fields. Extracting this knowledge requires analyzing the vast amount of data found within social media using the most popular data mining techniques and their applications to social media sites.
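As one concrete instance of the mining techniques referred to above, a minimal sentiment-analysis sketch using NLTK's VADER analyzer; the example posts are invented for illustration.

```python
# Minimal sketch of one common social-media mining task mentioned above:
# sentiment analysis of short posts with NLTK's VADER model.
# The example posts are invented for illustration.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)   # one-time lexicon download
sia = SentimentIntensityAnalyzer()

posts = [
    "Loving the new update, everything feels faster!",
    "Worst customer service I have ever dealt with.",
]
for post in posts:
    scores = sia.polarity_scores(post)       # neg/neu/pos plus compound score
    print(f"{scores['compound']:+.2f}  {post}")
```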
Credit card companies must be able to identify fraudulent credit card transactions so that clients are not charged for items they did not purchase. Many machine learning approaches and classifiers have previously been used to detect fraudulent transactions. However, because fraud patterns are always changing, it is becoming increasingly vital to investigate new frauds and adapt models to the new patterns. The purpose of this research is to create a machine learning classifier that not only detects fraudulent transactions but also correctly identifies legitimate ones; the model should therefore have excellent accuracy, precision, recall, and F1-score. To this end, we began with a large dataset and used four machine learning classifiers: Support Vector Machine (SVM), Decision Tree, Naïve Bayes, and Random Forest. In the experiments, the Random Forest classifier achieved 99.96% overall accuracy with the best precision, recall, F1-score, and Matthews correlation coefficient.
Ananya Sarker, Must. Asma Yasmin, Md. Atikur Rahman, Md. Harun Or Rashid, Bristi Rani Roy
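A minimal sketch of the comparison described in the preceding abstract, evaluating the four named classifiers on accuracy, precision, recall, F1-score, and the Matthews correlation coefficient. The file name, split, and hyperparameters are assumptions, and a linear-kernel SVM stands in for the paper's unspecified SVM configuration.

```python
# Minimal sketch: four classifiers compared on accuracy, precision,
# recall, F1, and Matthews correlation coefficient. Dataset loading and
# hyperparameters are illustrative assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import (accuracy_score, f1_score, matthews_corrcoef,
                             precision_score, recall_score)
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import LinearSVC
from sklearn.tree import DecisionTreeClassifier

df = pd.read_csv("creditcard.csv")          # assumed file name
X, y = df.drop(columns=["Class"]), df["Class"]
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

models = {
    "SVM": LinearSVC(dual=False),           # linear kernel, for tractability
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(f"{name}: acc={accuracy_score(y_te, pred):.4f} "
          f"prec={precision_score(y_te, pred):.3f} "
          f"rec={recall_score(y_te, pred):.3f} "
          f"f1={f1_score(y_te, pred):.3f} "
          f"mcc={matthews_corrcoef(y_te, pred):.3f}")
```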