Tuesday, October 29, 2019

Abuse in the Workplace: A Case of Gender Discrimination - Research Paper

The notion of discrimination is best described as favorable or unfavorable treatment of individuals based not on merit but on bias or other immoral attitudes (Mill, 1963). In what follows, a critical approach to gender discrimination is taken, along with its implications for organizations. Cases of gender discrimination remain common in the workplace today. Not long ago a lawsuit was filed against Cigna HealthCare in Boston on charges of gender discrimination. The lawsuit alleges that the company intentionally discriminated against Bretta Karp and other female employees by treating them unfavorably compared with their male counterparts in terms of promotion practices, pay increases, and an uncongenial workplace environment (Chase & Reidy, 2011). The company, however, denied all charges, claiming that it is committed to diversity management and does not allow any behavior that leads to discrimination of any sort (Chase & Reidy, 2011). Discrimination such as this adds to the woes of women in the workplace by lowering their morale and threatening their self-esteem. Research suggests that after controlling for factors such as age, education, experience, skills, and parental status, women still receive only 81% of the pay of men for work of a similar nature (Ashkanasy, Wilderom, & Peterson, 2010).

Sunday, October 27, 2019

Behavior of Packet Counts for Network Intrusion Detection

Statistical Behavior of Packet Counts for Network Intrusion Detection

Abstract— Intrusions and attacks have become a serious problem in the networking world. This paper presents a statistical characterization of packet counts that can be used for network intrusion detection. The main idea is to detect suspicious behavior in computer networks by comparing the correlation results of the control and data planes, in the presence and absence of attacks, using histogram analysis. Signal processing tools such as median filtering, moving average filtering, and local variance estimation are exploited to help develop network anomaly detection approaches. Detected dissimilarity can then indicate abnormal behavior.

Keywords— Anomaly detection, statistics, Network Intrusion Detection Systems (NIDS).

I. INTRODUCTION

Nowadays, use of the Internet has become essential and has increased considerably. Internet use has spread to daily work, business, education, entertainment, and more. Computer networks bring us many benefits, such as shared computing resources and better performance, but they also bring risks, so security systems have to be built to face those risks. One of those systems is the network intrusion detection system (NIDS), which is designed to alert network administrators to the presence of an attack. Intrusions are now classified as serious Internet security threats due to the mass service disruption they cause, the unsafe use of the Internet they lead to, and the difficulty of defending against them [1]. Some attacks aim to consume large amounts of resources to prevent legitimate users from receiving satisfactory performance. A Network Intrusion Detection System is a tool to detect attacks that attempt to compromise the availability, integrity, or confidentiality of the network. It is frequently used as one component of an effective layered security model for an organization. Such a system monitors network traffic continuously for malicious activity and raises alerts when it detects attacks. Existing intrusion detection systems can be classified into signature (misuse) detection systems and anomaly detection systems [2-3]. Signature detection systems rely on a database of a predefined set of attack signatures. They detect attacks by comparing the observed patterns of the network traffic with the database; if the attack is listed in the database, it can be successfully detected and identified [4]. On the other hand, anomaly detection systems are designed to compare the parameters of normal network traffic with the observed unusual traffic [5]. In such cases, the detected deviation from normal traffic is declared an attack. Such methods can detect new kinds of network attacks. In this paper, we aim to study intrusion and attack behavior by monitoring changes in network traffic. Detecting dissimilarity between the correlation results of the control and data planes can indicate abnormal behavior [6]. This paper is organized as follows. Section II reviews anomaly detection techniques. Section III presents the suggested statistical analysis. Section IV describes the proposed approach. Section V presents the experimental results. Section VI gives the concluding remarks.

II. ANOMALY DETECTION TECHNIQUES

A number of studies have focused on developing network anomaly detection methods.
For example, Haystack [7] is one of the statistical anomaly-based intrusion detection systems. In this system, a range of values is set to indicate the normal status of each pre-defined feature; if the values measured during a session lie outside the normal range, the score of a subject is raised. Haystack was designed to work offline, which was considered one of its drawbacks [8]. The Statistical Packet Anomaly Detection Engine (SPADE) [9] is another statistical anomaly-based intrusion detection system. It uses the concept of an anomaly score to detect port scans. A simple frequency-based approach is used to calculate the anomaly score of a packet: the fewer the packets, the higher the anomaly score. One drawback of SPADE is its high false alarm rate.

In this paper, we concentrate on the statistical analysis of the correlation sequence between data and control packet counts in computer networks [10]. The suggested approach is based on distinguishing the histograms of the correlation sequences of normal and abnormal traffic. The correlation sequences are processed either directly or after pre-processing with a differentiator, median filtering, or local variance estimation.

III. STATISTICS

Histogram Analysis

A histogram is a graphical representation of the distribution of data: a function that counts the number of observations falling into each of a set of disjoint categories (bins). Thus, if we let k be the total number of bins, n be the total number of observations, and m_i be the count in bin i, the histogram meets the following condition [7]:

$n = \sum_{i=1}^{k} m_i$   (1)

Median Filtering

Median filtering is based on sorting the data within a window and selecting the middle value. It is used to exclude impulsive values from the correlation sequences.

Mean

The mean is the average of a set of numbers:

$\bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i$   (2)

Variance

The variance is a measure of how items are dispersed about their mean. The variance of a whole population is given by [11]:

$\sigma^2 = \frac{1}{n} \sum_{i=1}^{n} (x_i - M)^2$   (3)

where M is the local mean.

IV. PROPOSED APPROACH

The proposed approach can be summarized in the following steps (a sketch of the pipeline in Python follows the list):

1. Network traffic packet traces are typically provided in raw tcpdump format [12]; therefore, packets are preprocessed to extract features in the format needed for further analysis [6].
2. Count features are extracted from the packet header information.
3. The similarity between the two traffic groups, control and data, is computed using the cross-correlation function.
4. The correlation sequence is pre-processed with median filtering, moving average, a differentiator, and local variance estimation.
5. Histograms of the original and pre-processed correlation sequences are estimated.
6. Databases of the histograms with and without attacks are created.
7. Thresholds are set based on these histograms for discrimination.
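As an illustration, the following is a minimal sketch of this pipeline in Python. It is not the authors' implementation: the per-interval count arrays, the window sizes, and the bin count are assumptions made for the example.

import numpy as np
from scipy.signal import medfilt

def normalized_xcorr(control_counts, data_counts):
    """Cross-correlation of the two zero-mean, unit-variance count series."""
    c = (control_counts - control_counts.mean()) / control_counts.std()
    d = (data_counts - data_counts.mean()) / data_counts.std()
    return np.correlate(c, d, mode="full") / len(c)

def moving_average(x, w=5):
    """Local mean over a sliding window of w samples (window size assumed)."""
    return np.convolve(x, np.ones(w) / w, mode="same")

def local_variance(x, w=5):
    """Sliding-window variance: E[x^2] - M^2, with M the local mean (Eq. 3)."""
    m = moving_average(x, w)
    m2 = moving_average(x * x, w)
    return np.maximum(m2 - m * m, 0.0)   # clamp tiny negatives from rounding

def feature_histograms(control_counts, data_counts, bins=50):
    """Histograms of the raw and pre-processed correlation sequence (steps 3-5)."""
    r = normalized_xcorr(control_counts, data_counts)
    features = {
        "raw": r,
        "median": medfilt(r, kernel_size=5),   # excludes impulsive values
        "local_mean": moving_average(r),
        "local_variance": local_variance(r),
        "differentiator": np.diff(r),
    }
    return {name: np.histogram(v, bins=bins, density=True)
            for name, v in features.items()}

Running feature_histograms once on attack-free counts and once on counts recorded during an attack would yield the two histogram databases from which the discrimination thresholds are set (steps 6-7).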
V. EXPERIMENTAL RESULTS

We have used the cross-correlation results between the control and data packets, with and without an attack, for one day of KSU traffic. Fig. 1 shows the correlation coefficients between the control and data packets when there is no attack, and Fig. 2 shows them when an attack is applied. Figs. 3-6 show the histogram distributions, for normal and abnormal traffic, of the correlation coefficients themselves and of their median, local mean, and local variance, respectively. The experimental results reveal that when there is an attack, a noticeable difference in the histogram distributions is found.

Fig. 1: Correlation coefficients for normal traffic.
Fig. 2: Correlation coefficients for abnormal traffic.
Fig. 3: Histogram distribution of the correlation coefficients for normal and abnormal traffic.
Fig. 4: Histogram of the correlation coefficients' median for normal and abnormal traffic.
Fig. 5: Histogram of the correlation coefficients' local mean for normal and abnormal traffic.
Fig. 6: Histogram of the correlation coefficients' local variance for normal and abnormal traffic.

From these figures, we can set a probability threshold for each case, based on which a decision of normal or abnormal traffic can be taken.

VI. CONCLUSION

This paper presented a statistical study of the correlation coefficients between the data and control planes of network traffic. Simulation experiments have shown that there is a difference in histogram distribution between normal and abnormal traffic. With the aid of signal processing tools such as median filtering, local mean filtering, and local variance filtering, we can set a group of thresholds to distinguish between normal and abnormal traffic.
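To make the thresholding step concrete, here is a short hedged sketch continuing the previous example. The paper does not specify a particular decision rule, so the log-likelihood comparison below is an illustrative assumption, not the authors' method; the (hist, edges) pairs are the tuples returned by np.histogram above.

import numpy as np

def histogram_log_likelihood(values, hist, edges, eps=1e-9):
    """Mean log-likelihood of `values` under a stored density histogram.
    Out-of-range values are lumped into the nearest edge bin."""
    idx = np.clip(np.searchsorted(edges, values) - 1, 0, len(hist) - 1)
    return float(np.log(hist[idx] + eps).mean())

def classify(values, normal_hist, attack_hist, threshold=0.0):
    """Declare the traffic abnormal when the attack histogram explains
    the observed correlation features better than the normal one."""
    ll_normal = histogram_log_likelihood(values, *normal_hist)
    ll_attack = histogram_log_likelihood(values, *attack_hist)
    return "abnormal" if ll_attack - ll_normal > threshold else "normal"

A separate threshold would be tuned per feature (raw, median, local mean, local variance) from the stored histograms, trading detection rate against false alarms.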

Friday, October 25, 2019

Empirical Research Essay

Empirical research is defined in the context of two separate types of study. Both methods are of value to the researcher in his or her quest for a better understanding of the test subjects. However, correlational and experimental studies each have their own set of qualifications, which allow for differences in subject and matter. Scientifically, some of these are useful, though others could be viewed as problematic. Correlational research is the process of studying the relationship between two variables. The examiner does not manipulate the variables in a correlational study. Findings can be positive, negative, or unrelated. Though scientific in its final statistical form, the researcher uses his senses to observe and ultimately determine into which category a study falls. A positive correlation shows increases in both variables. Alternately, a negative correlation relates the increase in one variable to the decrease in the other. There has to be an association between the two, or the result is unrelated. The scientific element of a correlational study is a measurable expression of degree called the correlation coefficient. It is a practical technique that gives a numerical representation to the study. The values correspond with the level of correlation, from negative one, demonstrating a perfect negative correlation, to positive one, showing a perfect positive correlation. A zero would indicate no relation between the variables.
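As an illustration of the correlation coefficient described above, here is a minimal sketch in Python; the variable names and sample data are invented for the example.

import numpy as np

# Invented sample data: hours studied vs. exam scores.
hours_studied = np.array([1, 2, 3, 4, 5, 6])
exam_scores = np.array([52, 60, 57, 68, 74, 80])

# Pearson's correlation coefficient: covariance divided by the product
# of the two standard deviations; it always falls between -1 and +1.
r = np.corrcoef(hours_studied, exam_scores)[0, 1]
print(f"r = {r:.2f}")  # close to +1 here, i.e., a strong positive correlation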

Thursday, October 24, 2019

Kinds of research data Essay

The basic classification of research data is in terms of quantitative and qualitative methods. Quantitative research entails the examination of numerical data using mathematical models and statistical procedures (Morgan, 2000). Qualitative research involves the use of analysis and opinion to interpret interviews and documents and to understand the phenomenon. This study requires a qualitative and partly quantitative approach rather than a purely numerical one.

Data Collection Approaches

Data collection approaches can be primary and secondary. Both primary and secondary approaches will be utilized. Extensive research will be carried out in the library as well as on the World Wide Web.

Qualitative Research Methodologies

Qualitative methodology focuses on "quality," a term referring to the essence or ambience of something. Qualitative methods are used to understand complex social phenomena. In the present situation, a literature survey along with interviews and questionnaires will be the key techniques used for data collection, collation, and analysis.

Case-Study Research Methodology

Case study is the primary research methodology employed in this study. A case study is an enquiry which uses multiple sources of evidence. It evaluates a contemporary phenomenon in a real-life context, within the boundaries of the phenomenon, and when the context is not clearly evident. Potter (1996) has defined the case study as a "realistic" methodology which deals with solid and specific questions. A case study translates research objectives into more researchable problems and provides rich examples which are easy to comprehend. The significance of case studies lies in their ability to reveal the meaning of a phenomenon. A peer-group case study approach is considered better suited than other techniques to implement this research, as it will concentrate on an empirical, contemporary problem. It will also enable answering the questions "why" and "how" to understand the underlying motivations, and it will generate empirical data and interesting information specific to the phenomenon under study. Research cases offer a unique tool for testing theory by examining phenomena which are beyond the traditional statistical approaches (Potter, 1996). Case study research is useful to the aims because the degree to which a case study produces valid and credible information is generally higher than that of a more general qualitative survey. An analysis of the Chinese and UK banking systems is proposed to be carried out with reference to a case study of the Bank of China and HSBC, London.

Limitations of Case Study

One of the limits of case studies is the difficulty of generalizing their findings. If the case design is sound and if the researcher is careful to be explicit about the phenomenon and the context of the study, then the results can be generalized (Potter, 1996). Another limit of case studies is that they generate a great deal of information that needs to be logically handled to strengthen the argument presented and to develop a credible conclusion.

Research Instruments

Primary data is collected for the research study to answer a specific question. Ways of collecting this kind of information include surveys, observation, and controlled experiments. Surveys are one of the most common ways to collect data; the subject can be contacted through mail, by telephone, or directly in personal interviews (Veal, 2000). They entail questionnaires, interviews, and informal enquiries made to people.
By carrying out interviews with a significant number of individuals, a broad perspective can be provided. This research will collect data essentially through interviews and questionnaires.

Data Collection Method

Questionnaire Survey

Surveys are frequently used to obtain information about social issues. Jones (1997) has described how a survey needs a planned strategy to gather data. Surveys can be conducted in many ways: over the telephone, by mail, or in person. Questionnaire surveys are ideal for providing complex information. A questionnaire involves a sample of the population, which can range from a few hundred to a few thousand respondents depending on the research study (Veal, 2000). The questionnaire technique is used in this study.

Limitations of the Questionnaire

On average the advantages outweigh the disadvantages, but a key drawback is a low response rate, which results in a low level of confidence and distorts the statistical analysis. Another weakness of the questionnaire is that its structured format allows little room for flexibility in the respondents' replies.

Interview

Interviews will form an interactive part of this study and will be carried out after the secondary research is done, in combination with the questionnaire. When conducting these interviews it will be important to keep a neutral view of the research topic, where actual rather than intended behavior needs to be identified. Veal (2000) has described the interview as a strategy to find out from people about things which cannot be observed directly.

Wednesday, October 23, 2019

Common Core and the Effects on American Language Learners

Since being introduced in 2010, forty-four states have adopted the Common Core State Standards Initiative. This initiative provides standards in English language arts and math, and every child in a public school will be expected to meet each standard for his or her grade level in these areas. These standards are designed to guarantee that every child in America will be on par with each other; the initiative's three-minute video uses the example of a child in Seattle who has an A in his English class but would be receiving a C in a public school in Chicago (Understanding Common Core). The standards are also designed to ensure that American youth graduate high school able to compete with other nations' graduates worldwide. So not only do American school children need to compete with their national peers, but they are also competing on a global level (Common Core).

Professor E. D. Hirsch, in an excerpt from his book on cultural literacy, discusses how cultural literacy is the only way for impoverished children to rise above their lot in life. Part of his assumption is that every American child needs a basic foundation, such as the subjects proposed by Common Core, to be provided by the American education system. Hirsch argues this foundation will ensure every child is culturally literate: "only by accumulation of shared symbols, and the shared information that the symbols represent, can we learn to communicate effectively with one another in our national community" (36). This statement is based on the assumption that all Americans speak the same language (Bizzell 661). This is simply no longer true. Between 2010 and 2011 there were 4.7 million English Language Learners (ELLs) in the American school system (Fast Facts). Unlike the No Child Left Behind Act, Common Core acknowledges that it cannot define and meet every need that ELLs have in order to learn the language used by Common Core on the same level as their peers (English Standards 6). Without going in depth into the standards, the most accessible information on the website is specific about how to accommodate special-needs children, but for ELLs all the standards say is that "it is possible for every child to meet the standards" (6). This paper will address the potential effects of Common Core on ELLs, using the effects proposed by both Common Core supporters and Common Core opponents. There is a lack of public information as to how these state standards are going to affect English Language Learners; there needs to be more focus on how state standards and federal tests are going to accommodate the growing number of ELLs.

In 2001 the Bush administration implemented the No Child Left Behind Act (NCLB). Under Title III of NCLB it was clearly stated how ELLs were to receive special attention. It allowed for extended time on tests and occasionally for a test to be administered in the learner's own language. These methods were not effective, because ELLs were still testing twenty to thirty percent lower than their English-proficient peers. NCLB required schools to break their students into subgroups for testing in comparison with their English-proficient peers. For ELLs the tests were used to gauge not only their progress in a particular subject but also their progress in English proficiency. Not surprisingly, ELLs were doing significantly worse on their English and language arts tests than on math and science.
A glaring fault of NCLB was that once a child began to test on the same level as their English-proficient peers, they were removed from the ELL subgroup and expected to keep succeeding without the extra accommodations. These ideas were implemented without there ever being proof they would work. Abedi and Dietel finish their Winter 2004 report for CRESST with the statement, "For a goal to be within reach of all schools, at least one school should have already attained it. To date we have yet to see a school with a sizeable ELL population that meets the 2014 NCLB requirements" (5).

In preparation for the continued rapid growth of the ELL population, which according to a TESOL brief is currently approximated at 6 million, Common Core broke English proficiency expectations down into four categories (4). An independent study conducted by Stanford defined the four categories as reading, writing, speaking and listening, and language (3, 5-7). The reading requirements are designed to make sure students can read and comprehend complex texts across all subjects. The writing standards ensure students are prepared to research, analyze, and argue. The speaking and listening requirements ensure that every student can understand and orally articulate their own ideas and arguments and those of others. Finally, the language requirement refers to grammar; in the paper, the authors note that students need to understand that English is "as much a craft as a set of rules" (7). The paper argues that, according to the standards, language will tie the four brackets together (7).

The opposition holds that Common Core will force teachers to teach to the test. Regular standardized testing increases pressure on students to find the right answer instead of encouraging learning and independent thought (Hawkins). According to New York City teacher Katie Lapham in a letter to Carmen Fariña, ELLs, particularly in grades three through eight, are "encumbered with standardized testing." Here is a quote from a resignation letter by former Colorado Springs English teacher Pauline Hawkins: "I am supposed to help them think for themselves... Instead, the emphasis is on Common Core Standards and high stakes testing that is creating a teach to the test mentality for our teachers, and stress and anxiety for our students." This anxiety is increased when a student is being tested not only on their comprehension of the material but also on their comprehension of the English language. The majority of teachers who oppose Common Core want less regular testing and more diversified ways of evaluation.

Common Core is moving fast. Most teachers are being required to rewrite curricula to match standardized testing, which is being implemented in some states as early as April 2014. In particular, ELLs in grades three through eight who have been in the country for a year or less are allowed one exemption from a test. After that, they are required to test to the level of their English-proficient peers. There is an extra time allotment for ELLs, but as Katie Lapham states in a blog post titled "Battling the High-stakes Testing Beast: from NAPE to NYSE," "the state has generously offered to give you extended time (time and a half) on the tests; instead of 90 minutes per day for six days, that's 135 minutes per testing day. That's a total of 13.5 hours!" Hours upon hours of testing for a fifth grader who is working twice as hard to comprehend the test does not seem like the right answer.
Lapham in the same post discusses how she, as their teacher, does not have access to the test results, only her students' scores. To summarize: teachers are required to conform their curricula to the tests, the same tests whose failed material they are not allowed to see; ELLs are exempt from only one test in the entirety of their schooling; and during testing periods ELLs can spend upwards of twelve hours a week testing.

According to the association of Teachers of English to Speakers of Other Languages (TESOL) and an independent Stanford study, Common Core will present significant challenges to ELLs but will also provide an education that offers them the same opportunities as their native-English-speaking peers. When Common Core was first developed, English Language Proficiency Development (ELPD) standards were left up to the individual states. Starting in 2012, Common Core recognized that a standard for ELPD would need to be developed. In 2012 the Council of Chief State School Officers produced a basic framework for states to use when adapting their ELL standards. They also hired the Partnership for Assessment of Readiness for College and Careers and the Smarter Balanced Assessment Consortium to prepare the assessment test prototypes. According to TESOL, "[the tests will] be administered by computer; both consortia are exploring technology-based accommodations, such as pop-up glossaries and captions for audio, to ensure the widest accessibility to the test items" (7). Measures such as extra glossaries and captions should take some of the pressure off the students. Common Core firmly states that only through regular standardized tests can teachers and the rest of the education community truly understand what students are learning.

The Framework for English Language Proficiency Development Standards corresponding to the Common Core State Standards and the Next Generation Science Standards is vividly clear that it does not force schools to adopt a curriculum. Instead it simply specifies what information students are expected to master at every grade. There is no proposed curriculum or any specific standards to guide teachers. The Framework, along with Common Core, has yet to address what will happen if a child cannot meet the standards. NCLB clearly defined that if a child failed, the school would be required to use Title I funding to give the student the ability to travel to a school where they could get a better education. Diane Ravitch, in her article in the Washington Post, poses the same question: what will happen to students who fail? How much funding will go to provide tutoring? What will the repercussions be for teachers whose students are not meeting the standard? Will schools that consistently fail the standards be closed? These are just some of the many unanswered questions Common Core raises for ELLs. The Framework answered some questions, such as which types of questions students are supposed to be able to answer at each grade, what type of thinking process they should have mastered, and what kinds of tests they should be able to pass.

No one on either side of the issue is making light of the extra struggle ELLs are going to have on top of the rigorous workload of an English-proficient student. Those who are pro Common Core firmly state that by going through the Common Core standards, ELLs will be prepared for the workforce or further education. Whether that statement is true or not has yet to be seen.
Only time and testing will prove whether Common Core is truly beneficial to ELLs. The facts state that there are approximately six million ELLs in the American public school system today. Until we have a working ELL program with proven results, it is counterproductive to expect those students to perform well on the Common Core tests. The government is in an awkward middle ground: some of the information has been released, but without enough of it to determine whether or not the program's success is even plausible.