A design of a beam-steerable antenna array for future cellular communication (5G) is presented. The proposed design contains eight compact end-fire antenna elements arranged along the top edge of the smartphone printed circuit board (PCB). Each antenna element consists of conductive patterns on the top and bottom copper foil layers and a substrate layer with a via-hole. Simulated results, including input impedance and fundamental radiation properties, are presented and discussed. The impedance bandwidth (S11 ≤ -10 dB) of the antenna spans 17.5 to 21 GHz (more than 3 GHz of bandwidth) with a resonance at 19 GHz. The antenna exhibits end-fire (directional) radiation beams with a wide-angle scanning property and could be used for future 5G beam-forming. Furthermore, the characteristics of the array design in the vicinity of the user's hand are studied.
In this paper, a multiple-input/multiple-output (MIMO) antenna design with polarization and radiation pattern diversity is presented for future smartphones. The design consists of four double-fed circular-ring antenna elements located at different edges of the printed circuit board (PCB), which uses an FR-4 substrate with overall dimensions of 75×150 mm². The antenna elements are fed by 50-Ohm microstrip lines and provide polarization and radiation pattern diversity due to the orthogonal placement of their feed lines. A good impedance bandwidth (S11 ≤ -10 dB) of 3.4-3.8 GHz has been obtained for the smartphone antenna array; for S11 ≤ -6 dB, the bandwidth extends to 3.25-3.95 GHz. More than 3 dB of realized gain and 80% total efficiency are achieved for the single-element radiator. The presented design not only provides the required radiation coverage but also offers polarization diversity.
The potentiostat (constant-potential instrument) is an important component of pipeline anti-corrosion systems in the chemical industry. Based on Internet of Things (IoT) technology, Programmable Logic Controller (PLC) technology, and database technology, this paper develops a remote monitoring and management system for potentiostats, enabling remote monitoring and adjustment of the potentiostat's working status. The system provides real-time data display, historical data query, alarm push management, and user permission management, and supports both Web access and mobile client application (APP) access. Tests on an actual engineering project show that the system is stable and can be widely used in cathodic protection systems.
A compact multi-sector patch antenna array design for 60 GHz applications is presented and discussed in detail. The proposed design combines five 1×8 linear patch antenna arrays, referred to as sectors, in a multi-sector configuration. The coaxial-fed radiating elements of the multi-sector array are designed on a 0.2 mm Rogers RT5880 substrate. The array operates in the 58-62 GHz frequency range and provides switchable directional/omnidirectional radiation beams with high gain and high directivity. The designed multi-sector array exhibits good performance and could be used in fifth-generation (5G) cellular networks.
In today's China, the well-educated middle class, with stable jobs and above-average income, is the driving force behind the country's Internet society. Through analysis of data from the 2015 Chinese General Social Survey and 50 interviews, this study investigates this group's Internet usage. The findings demonstrate that daily life among members of this socioeconomic group is closely tied to the Internet. For the Chinese middle class, the Internet is used to socialize and to entertain themselves and others; it is also used to search for and share information and to build their identities. The empirical results of this study provide a reference, supported by factual data, for enterprises seeking to target the Chinese middle class through online marketing efforts.
The diversity and complexity of modern IT systems make it almost impossible for internal teams to find all vulnerabilities in software before it is officially released. The emergence of threat intelligence and vulnerability reporting policies has greatly reduced the burden on software vendors and organizations to find vulnerabilities. However, to prove the existence of a reported vulnerability, a security incident response team must build a deliberately vulnerable environment from a report containing limited and incomplete information, which is necessary but difficult. This paper presents a structured, standardized, machine-oriented vulnerability intelligence format that can be used to automate the orchestration of a Deliberately Vulnerable Environment (DVE). The paper highlights the important role of software configuration and proof-of-vulnerability specifications in vulnerability intelligence, and proposes a triad model, called DIR (Dependency Configuration, Installation Configuration, Runtime Configuration), to define software configuration. Finally, a prototype system is implemented to demonstrate that the orchestration of a DVE can be automated with this intelligence.
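A minimal sketch of how the DIR triad might be encoded as a machine-readable record is given below; the field names and the example values are hypothetical and only illustrate the idea of separating dependency, installation, and runtime configuration, not the paper's actual format.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class DIRConfiguration:
    """Hypothetical machine-oriented record for the DIR triad."""
    dependency: List[str] = field(default_factory=list)        # e.g. base OS image, libraries
    installation: Dict[str, str] = field(default_factory=dict)  # e.g. package versions, paths
    runtime: Dict[str, str] = field(default_factory=dict)       # e.g. ports, service settings


# Illustrative (made-up) entry describing one vulnerable web application
vuln_env = DIRConfiguration(
    dependency=["ubuntu:18.04", "php:7.2"],
    installation={"app": "example-cms-1.0", "install_path": "/var/www/html"},
    runtime={"listen_port": "8080", "debug": "off"},
)
print(vuln_env)
```

Such a record could then be consumed by an orchestration tool to stand up the environment automatically, which is the role the abstract assigns to the prototype system.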
Technology has rapidly become a pervasive part of everyday life. It has infiltrated almost every aspect of people's lives, changing how they work, how they learn, and how they perceive things. Academic institutions, like other organizations, have deeply modified their strategies to integrate technology into the institutional vision and corporate strategy, and the pressure to do so has never been greater. Information and Communications Technology (ICT) continues to be recognized as a major factor in organizations realizing their aims and objectives. Consequently, ICT has an important role in mobilizing an academic institution's strategy to support the delivery of operational, strategic, or transformational objectives. The ICT strategy should align the institution with the radical changes of the ICT world through the use of Enterprise Architecture (EA). Hence, EA's objective is to integrate islands of legacy processes into an optimized whole that is receptive to change and supportive of delivering the strategy. This paper explores the motivational antecedents during the adoption of EA in a Higher Education Institution in the Philippines for its ICT strategic plan. The seven antecedents (viewpoint, stakeholders, human traits, vision, revolutionary innovation, techniques, and change components) provide insight into EA adoption and the factors that influence the adoption process.
In this work, a training algorithm for probabilistic neural networks (PNN) is presented. The algorithm addresses one of the major drawbacks of the PNN, namely the size of its hidden layer. Using a cross-validation training algorithm, the number of hidden neurons is reduced to a smaller set consisting of the most representative samples of the training set, without affecting the overall architecture of the network. The performance of the network is compared against that of the standard PNN on several databases from the UCI repository. Results show an important improvement in network size and performance.
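The abstract does not give the selection criterion, but the general idea of keeping only the pattern-layer samples that the rest of the training set cannot classify correctly can be sketched as below; the leave-one-out style criterion, the Gaussian kernel width, and the NumPy inputs are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np


def gaussian_kernel(x, c, sigma=0.5):
    return np.exp(-np.sum((x - c) ** 2) / (2 * sigma ** 2))


def pnn_predict(x, centers, labels, sigma=0.5):
    """Standard PNN decision: sum kernel activations per class, pick the largest."""
    classes = np.unique(labels)
    scores = [sum(gaussian_kernel(x, c, sigma)
                  for c, y in zip(centers, labels) if y == cls)
              for cls in classes]
    return classes[int(np.argmax(scores))]


def prune_hidden_layer(X, y, sigma=0.5):
    """Keep only samples that the remaining samples misclassify (assumed criterion):
    those are treated as the most representative pattern neurons."""
    keep = []
    for i in range(len(X)):
        others = [j for j in range(len(X)) if j != i]
        if pnn_predict(X[i], X[others], y[others], sigma) != y[i]:
            keep.append(i)
    return X[keep], y[keep]
```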
This paper presents a web application for improving images through recognition. The web application is based on the analysis of picture-based recognition methods that allow an improvement of the physical appearance of people posting in social networks. The approach relies on the study of tools that can correct or improve some features of the face, with the help of a wide collection of user images taken as reference to build a facial profile. Automatic facial profiling can be achieved with a deeper study of the Object Detection Library. The initial images were improved with the help of MATLAB and its filtering functions. The user can interact directly with the program and manually adjust their preferences.
The proliferation of mobile devices in society enables the media to disseminate information and knowledge more rapidly. Higher education students access this content and share it with each other on the most diverse platforms, making access to information ubiquitous. This article presents the results, and the corresponding quantitative analysis, of a survey applied to communication students at two higher education institutions: one in Portugal and another in Spain. The results show that, in this sample, higher education students regularly access news content and believe traditional news sources to be more credible. Regarding online sources, access was mostly to free news content. This study aims to promote knowledge about the changes occurring in higher education students' relationship with the media, characterizing how these students consume news in light of the effects of digital media evolution. The intent is to present not only the news sources they use but also their habits and relationship with the news media.
The Internet of Things (IoT) will lead to the development of advanced Smart Home services that are pervasive, cost-effective, and accessible by home occupants from anywhere at any time. However, advanced smart home applications will introduce grand security challenges due to the increased attack surface. Current approaches do not handle cybersecurity from a holistic point of view; hence, a systematic cybersecurity mechanism needs to be adopted when designing smart home applications. In this paper, we present a generic intrusion detection methodology to detect and mitigate anomalous behaviors occurring in Smart Home Systems (SHS). Using our Smart Home Context Data Structure, the heterogeneous information and services acquired from the SHS are mapped into context attributes that describe the context of smart home operation precisely and accurately. Runtime models describing the usage patterns of home assets are developed based on characterization functions. Finally, a threat-aware action management methodology is proposed to efficiently mitigate anomalous behaviors. Our preliminary experimental results show that the methodology can be used to detect and mitigate both known and unknown threats, and to protect SHS premises and services.
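The abstract does not specify the context data structure or the characterization functions; the fragment below is only a hedged illustration of mapping heterogeneous readings into context attributes and flagging values that fall outside a learned usage range. The attribute names, bounds, and values are invented.

```python
from dataclasses import dataclass


@dataclass
class ContextAttribute:
    """One attribute of the smart home context with a learned normal-usage range."""
    name: str
    value: float
    low: float    # learned lower bound of normal usage
    high: float   # learned upper bound of normal usage

    def is_anomalous(self) -> bool:
        return not (self.low <= self.value <= self.high)


# Hypothetical snapshot of smart home context mapped from heterogeneous sensors
snapshot = [
    ContextAttribute("living_room_temp_C", 41.0, 16.0, 30.0),
    ContextAttribute("front_door_open_events_per_hour", 2.0, 0.0, 6.0),
]

anomalies = [a.name for a in snapshot if a.is_anomalous()]
print(anomalies)  # ['living_room_temp_C']
```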
In response to the dearth of information about emoji use for different purposes in different settings, this paper investigates the paralinguistic function of emojis within Twitter communication in the United States. To conduct this investigation, Twitter feeds from 16 population centers spread throughout the United States were collected from the Twitter public API. One hundred tweets were collected from each population center, totaling 1,600 tweets. Tweets containing emojis were then extracted using the "emot" Python package and analyzed with the IBM Watson Natural Language Understanding API to identify the topics discussed. A manual content analysis was then conducted to ascertain the paralinguistic and emotional features of the emojis used in these tweets. We present our characterization of emoji usage on Twitter and discuss implications for the design of Twitter and other text-based communication tools.
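The paper performs the extraction step with the "emot" package; as a rough, library-free approximation of that step, emojis can be pulled from tweet text with a Unicode-range pattern such as the one below. The ranges shown cover only the most common emoji blocks and are an assumption for illustration, not emot's implementation.

```python
import re

# Common emoji blocks: emoticons, misc symbols & pictographs, transport, supplemental symbols
EMOJI_PATTERN = re.compile(
    "[\U0001F600-\U0001F64F"
    "\U0001F300-\U0001F5FF"
    "\U0001F680-\U0001F6FF"
    "\U0001F900-\U0001F9FF]+",
    flags=re.UNICODE,
)


def extract_emojis(tweet: str):
    """Return the emoji sequences found in a tweet."""
    return EMOJI_PATTERN.findall(tweet)


print(extract_emojis("Great game tonight 🙌🔥"))  # ['🙌🔥']
```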
With the rapid development of modern communication, real-time diagnosis of fiber-optic quality and faults has attracted wide attention. In this paper, a LabVIEW-based system is proposed for fiber-optic fault detection. A wavelet threshold denoising method combined with Empirical Mode Decomposition (EMD) is applied to denoise the optical time-domain reflectometer (OTDR) signal, and a method based on the Gabor representation is then used to detect events. Experimental measurements show that the signal-to-noise ratio (SNR) of the OTDR signal is improved by 1.34 dB on average compared with the wavelet threshold denoising method alone. The proposed system scores highly in event detection capability and accuracy, and its maximum detectable fiber length reaches 65 km.
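A minimal sketch of the denoising chain (EMD followed by wavelet thresholding of the noisiest components) is shown below. It assumes the third-party PyEMD (EMD-signal) and PyWavelets packages as stand-ins for the paper's LabVIEW implementation; the choice of wavelet, decomposition level, universal threshold, and which IMFs to threshold is illustrative only.

```python
import numpy as np
import pywt
from PyEMD import EMD  # pip install EMD-signal


def denoise_otdr(signal, wavelet="db4", level=4):
    """Decompose with EMD, wavelet-threshold the high-frequency IMFs, and rebuild."""
    imfs = EMD()(np.asarray(signal, dtype=float))  # intrinsic mode functions
    cleaned = np.zeros(len(signal))
    for k, imf in enumerate(imfs):
        if k < 2:  # assume the first IMFs carry most of the noise
            coeffs = pywt.wavedec(imf, wavelet, level=level)
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745
            thr = sigma * np.sqrt(2 * np.log(len(imf)))
            coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft")
                                    for c in coeffs[1:]]
            imf = pywt.waverec(coeffs, wavelet)[: len(signal)]
        cleaned += imf
    return cleaned
```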
In this paper, we conduct a systematic survey of urban communities in Lithuania to evaluate their potential to co-create collective intelligence, or "civic intelligence", applying a Digital Co-creation Index methodology that includes different socio-technological indicators. Civic intelligence is a form of collective intelligence that refers to a group's capacity to perceive societal problems and to address them effectively. The research focuses on evaluating diverse organizational designs that increase efficient collective performance. The project advances the state of the art by systematically evaluating the basic preconditions through which collective intelligence is co-created in urban communities. The research subject is "bottom-up" digitally enabled urban platforms initiated by Lithuanian public organizations, civic movements, or business entities. The web-based monitoring results, obtained by applying a social indices calculation methodology and Pearson correlation analysis, provide information about the potential and limits of the urban communities and the changes needed to overcome those limits.
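As an illustration of the correlation step only, a Pearson coefficient between two hypothetical community indices (say, a digital-participation score and a collective-performance score) could be computed as follows; the index names and values are invented and are not the study's data.

```python
from scipy.stats import pearsonr

# Hypothetical per-community indices on a 0-1 scale
participation_index = [0.42, 0.55, 0.61, 0.38, 0.70, 0.47]
performance_index = [0.35, 0.58, 0.66, 0.31, 0.74, 0.50]

r, p_value = pearsonr(participation_index, performance_index)
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")
```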
In the current age of technology, information is a company's most precious asset. Today, companies hold large amounts of data, and as data volumes grow, retrieving particular information becomes slower day by day. Processing data quickly enough to turn it into useful information is the biggest issue. The major problems in distributed databases are the efficiency and response time of data distribution; the security of data distribution is also a significant issue. To address these problems, we propose a strategy that can maximize the efficiency of data distribution and also improve its response time. The technique gives better results for secure data distribution from multiple heterogeneous sources, enabling companies to share data securely, efficiently, and quickly.
Requirements Engineering (RE) is a core activity of the software development lifecycle that shapes the structure of the software to be built. Software product line development is a new topic area within the domain of software engineering. RE also plays an important role in decision making and ultimately helps a growing business environment achieve productive software development. Decisions are central to engineering processes and hold them together; it is argued that better decisions lead to better engineering, and achieving better decisions requires that they be understood in detail. To address these issues, companies are moving towards Software Product Line Engineering (SPLE), which helps provide large varieties of products with minimal development effort and cost. This paper proposes a new framework for software product lines and compares it with other models. The results can help in understanding the needs of SPL testing by identifying points that still require additional investigation. In future work, we will apply this model in a controlled environment with industrial SPL projects, opening a new horizon for SPL process management testing strategies.
Check-in locations on social media provide information about an individual's location, and the millions of data records generated on these sites provide knowledge about human activity. In this research, we used a geolocation service and users' texts posted on Twitter to analyze human mobility. Our research answers two questions: what are the movement patterns of a citizen, and how far do people travel in the city? We explore the trajectories of 22,318 users across 201,118 check-ins over a period of one month in Makassar, Indonesia. To capture individual mobility, we analyze only users with more than 30 check-ins, and we use a systematic sampling approach to select the research sample. The study found that individual movement shows a high degree of regularity and intensity in certain places. We also found that the average distance an urban inhabitant travels per day is about 9.6 km.
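Daily travel distances such as the reported 9.6 km average are typically obtained by summing great-circle distances between consecutive check-ins; a minimal sketch using the haversine formula is given below. The coordinates are invented examples near Makassar, not the study's data.

```python
from math import asin, cos, radians, sin, sqrt


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))


# Hypothetical consecutive check-ins of one user during a day
checkins = [(-5.135, 119.423), (-5.147, 119.432), (-5.162, 119.449)]
daily_distance = sum(haversine_km(*a, *b) for a, b in zip(checkins, checkins[1:]))
print(f"{daily_distance:.1f} km")
```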
With the Internet becoming the dominant channel for business and life, many IPs are increasingly masked using web proxies for illegal purposes such as propagating malware, impersonating phishing pages to steal sensitive data, or redirecting victims to other malicious targets. Moreover, as Internet traffic continues to grow in size and complexity, detecting proxy services has become increasingly challenging due to their dynamic updates and high anonymity. In this paper, we present an approach based on behavioral graph analysis to study the behavioral similarity of web proxy users. Specifically, we use bipartite graphs to model host communications from network traffic and build one-mode projections of these bipartite graphs to discover the social-behavior similarity of web proxy users. Based on the similarity matrices of end users derived from the one-mode projection graphs, we apply a simple yet effective spectral clustering algorithm to discover the inherent behavior clusters of web proxy users. Web proxy URLs may vary over time, but the users' inherent interests do not; based on this intuition, and using our private tools implemented with WebDriver, we examine whether the top URLs visited by web proxy users are themselves web proxies. Our experimental results on real datasets show that the behavior clusters not only reduce the number of URLs to analyze but also provide an effective way to detect web proxies, especially unknown ones.
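A hedged sketch of the analysis chain (bipartite host-URL graph, one-mode projection onto hosts, spectral clustering of the resulting similarity matrix) is shown below using networkx and scikit-learn as stand-ins for the paper's implementation; the edge data are invented.

```python
import networkx as nx
import numpy as np
from networkx.algorithms import bipartite
from sklearn.cluster import SpectralClustering

# Hypothetical host -> visited-URL edges extracted from traffic logs
edges = [("h1", "proxyA.example"), ("h1", "news.example"),
         ("h2", "proxyA.example"), ("h2", "proxyB.example"),
         ("h3", "news.example"), ("h3", "mail.example")]

B = nx.Graph()
hosts = {h for h, _ in edges}
B.add_nodes_from(hosts, bipartite=0)
B.add_nodes_from({u for _, u in edges}, bipartite=1)
B.add_edges_from(edges)

# One-mode projection: hosts linked by shared URLs, weighted by the overlap
P = bipartite.weighted_projected_graph(B, hosts)
nodes = sorted(P.nodes())
A = nx.to_numpy_array(P, nodelist=nodes, weight="weight")

# Spectral clustering on the similarity matrix (identity added so the diagonal is non-zero)
labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(A + np.eye(len(nodes)))
print(dict(zip(nodes, labels)))
```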
In the digital era, in which we face an avalanche of information, it is not new that the Internet has brought new modes of communication and knowledge access. Characterized by a multiplicity of discourses, opinions, beliefs, and cultures, the web is a space of political-ideological dimensions where people (who often do not know each other) interact and create representations, deconstruct stereotypes, and redefine identities. Currently, translators need to be able to deal with digital spaces ranging from specific software to social media, which inevitably impact their professional lives. One of the most impactful ways of being seen in cyberspace is participation in social networking groups. In addition to their ability to disseminate information among participants, social networking groups allow significant personal and social exposure, due to the visibility each participant achieves not only on their personal profile page but also in each comment or post made in the groups. The objective of this paper is to study the representations of translators and the translation process on the Internet, more specifically in publications in two highly influential Brazilian groups on Facebook: "Translators/Interpreters" and "Translators, Interpreters and Curious". These groups reflect the changes the network has brought to the profession, including the way translators are seen and see themselves. The analyzed posts allowed a reading of what common sense seems to think about the translator, as opposed to what translators seem to think about themselves as a professional class. The analysis leads to the conclusion that these two positions are antagonistic and sometimes represent a conflict of interests: on the one hand, society in general considers the translator's work easy and therefore not deserving of good remuneration; on the other hand, translators know how complex the translation process is and how much it takes to be a good professional. The results also reveal that social networking sites such as Facebook provide more visibility, but a more active role is required from translators to achieve greater appreciation and recognition of the profession, especially in the face of the increasing development of automatic translation programs.
In this paper, we investigate several routing protocols in the Vehicular Ad-Hoc Network (VANET) context. Specifically, we study the efficiency of Dynamic Source Routing (DSR), Ad hoc On-demand Distance Vector routing (AODV), Destination-Sequenced Distance Vector (DSDV), Optimized Link State Routing (OLSR), and the Vehicular Multi-hop algorithm for Stable Clustering (VMASC) in terms of packet delivery ratio (PDR) and throughput. The performance evaluation and comparison of the studied protocols show that VMASC is the best protocol regarding fast data transmission and link stability in VANETs. All results are validated with the NS-3 simulator.
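For reference, the two metrics compared in the study reduce to simple ratios over the simulation trace; a minimal sketch of how they might be computed from per-flow counters is given below. The counter values are hypothetical, not the paper's NS-3 output.

```python
def packet_delivery_ratio(received: int, sent: int) -> float:
    """PDR = packets successfully delivered / packets sent."""
    return received / sent if sent else 0.0


def throughput_kbps(received_bytes: int, sim_time_s: float) -> float:
    """Throughput in kilobits per second over the simulation time."""
    return (received_bytes * 8) / (sim_time_s * 1000)


# Hypothetical per-flow counters from a VANET simulation run
print(packet_delivery_ratio(received=912, sent=1000))          # 0.912
print(throughput_kbps(received_bytes=466_944, sim_time_s=60))  # ~62.3 kbps
```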
This paper illustrates an application of a granular computing approach, namely rough set theory, to data mining. The paper outlines the formalism of granular computing and elucidates the mathematical underpinnings of rough set theory, which has been widely used by the data mining and machine learning communities. A real-world application is illustrated, and the classification performance is compared with other contending machine learning algorithms. The predictive performance of the rough set rule induction model is comparable to that of the other contending algorithms.
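The core rough set notions behind such rule induction, the lower and upper approximations of a decision class with respect to the indiscernibility relation on condition attributes, can be sketched in a few lines; the toy decision table below is invented for illustration.

```python
from collections import defaultdict


def approximations(objects, condition_attrs, target_set):
    """Lower/upper approximation of target_set w.r.t. indiscernibility on condition_attrs."""
    blocks = defaultdict(set)
    for name, attrs in objects.items():
        key = tuple(attrs[a] for a in condition_attrs)
        blocks[key].add(name)                       # elementary (indiscernible) sets
    lower = set().union(*[b for b in blocks.values() if b <= target_set])
    upper = set().union(*[b for b in blocks.values() if b & target_set])
    return lower, upper


# Toy decision table: condition attributes and the decision class "flu = yes"
table = {
    "p1": {"temp": "high", "cough": "yes"},
    "p2": {"temp": "high", "cough": "yes"},
    "p3": {"temp": "normal", "cough": "no"},
    "p4": {"temp": "high", "cough": "no"},
}
flu_yes = {"p1", "p4"}
print(approximations(table, ["temp", "cough"], flu_yes))
# lower = {'p4'}, upper = {'p1', 'p2', 'p4'}
```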
Big data applications have become imperative for many fields, and many researchers have devoted themselves to increasing classification accuracy and reducing time complexity. Hence, this study designs and proposes an ontology-based backpropagation neural network classification and reasoning strategy for NoSQL big data applications, called ON4NoSQL. ON4NoSQL is responsible for enhancing the performance of classification in NoSQL and SQL databases in order to build mass behavior models. The mass behavior models are built with MapReduce techniques and the Hadoop Distributed File System on a Hadoop service platform. The inference engine of ON4NoSQL is the ontology-based backpropagation neural network classification and reasoning strategy. Simulation results indicate that ON4NoSQL can efficiently construct a high-performance environment for data storing, searching, and retrieving.