Shaping the criminal justice system: the role of those supported by Criminal Justice Service
In Scotland, the development and delivery of personalised social work services has been part of a wider public service reform agenda, building on Changing lives: report of the 21st century review of social work (Scottish Executive, 2006). This agenda has focused on harnessing the strengths, predilections, networks and capacities of those supported by services to inform the design and delivery of services. To date, the place of criminal justice in this reform agenda has received comparatively limited attention (Weaver, 2011). This Insight focuses on the issue of involving those who have offended in shaping the criminal justice system, exploring the different models of involvement, the effectiveness of different approaches and the implications for Criminal Justice Social Work services.
Discursive manoeuvres and hegemonic recuperations in New Zealand documentary representations of domestic violence
This paper examines three television documentaries--entitled Not Just a Domestic (1994), Not Just a Domestic: The Update (1994), and Picking Up the Pieces (1996)--that together formed part of the New Zealand police ‘Family Violence’ media campaign. Through a Foucauldian, feminist poststructuralist discourse analysis, the paper examines how these texts assert and privilege particular understandings of domestic violence, its causes, effects and possible solutions. The analysis illustrates the way in which five discursive explanations of domestic violence--those of medical pathology, romantic expressive tension, liberal humanist instrumentalism, tabula rasa learning and socio-systematic discourse--are articulated and hierarchically organised within these documentaries, and considers the potential hegemonic effects of each text’s discursive negotiations. It is argued that the centrality of personal ‘case studies’ and the testimonies of both battered women and formerly violent men work to privilege individualistic rather than socio-political explanations of domestic violence. Additionally, the inclusion of extensive ‘survivor speech’ means that women are frequently asked to explain and rationalise their actions as ‘victims’ of domestic violence, while fewer demands are placed on male perpetrators to account for their violent behaviour. Consequently, the documentaries leave the issue of male abuse of power largely unchallenged, and in this way ultimately affirm patriarchal hegemonic interests.
A reliable multicast for XTP
Multicast services needed for current distributed applications on LANs fall generally into one of three categories: datagram, semi-reliable, and reliable. Transport layer multicast datagrams represent unreliable service in which the transmitting context 'fires and forgets'. XTP executes these semantics when the MULTI and NOERR mode bits are both set. Distributing sensor data and other applications in which application-level error recovery strategies are appropriate benefit from the efficiency in multidestination delivery offered by datagram service. Semi-reliable service refers to multicasting in which the control algorithms of the transport layer--error, flow, and rate control--are used in transferring the multicast distribution to the set of receiving contexts, the multicast group. The multicast defined in XTP provides semi-reliable service. Since, under a semi-reliable service, joining a multicast group means listening on the group address and entails no coordination with other members, a semi-reliable facility can be used for communication between a client and a server group as well as true peer-to-peer group communication. Resource location in a LAN is an important application domain. The term 'semi-reliable' refers to the fact that group membership changes go undetected. No attempt is made to assess the current membership of the group at any time--before, during, or after--the data transfer.
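The MULTI/NOERR combinations described in this abstract can be sketched as a small classifier. The flag names follow the text, but the bit values and the `service_category` helper are illustrative assumptions, not XTP's actual header layout:

```python
# Hedged sketch: the flag values below are hypothetical and do not match
# XTP's real bit assignments; only the MULTI/NOERR semantics follow the
# abstract above (both set -> datagram; MULTI alone -> semi-reliable).
from enum import Flag, auto

class Mode(Flag):
    NONE = 0
    MULTI = auto()   # request multicast delivery
    NOERR = auto()   # disable transport-layer error control

def service_category(mode: Mode) -> str:
    """Classify the service implied by a mode-bit combination."""
    if Mode.MULTI not in mode:
        return "unicast"
    if Mode.NOERR in mode:
        return "datagram"       # fire-and-forget multicast
    return "semi-reliable"      # error/flow/rate control, membership unchecked

print(service_category(Mode.MULTI | Mode.NOERR))  # datagram
print(service_category(Mode.MULTI))               # semi-reliable
```

A fully reliable multicast would additionally track group membership, which the abstract notes is exactly what the semi-reliable service omits.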
Issues in designing transport layer multicast facilities
Multicasting denotes a facility in a communications system for providing efficient delivery from a message's source to some well-defined set of locations using a single logical address. While modern network hardware supports multidestination delivery, first generation Transport Layer protocols (e.g., the DoD Transmission Control Protocol (TCP) (15) and ISO TP-4 (41)) did not anticipate the changes over the past decade in underlying network hardware, transmission speeds, and communication patterns that have enabled and driven the interest in reliable multicast. Much recent research has focused on integrating the underlying hardware multicast capability with the reliable services of Transport Layer protocols. Here, we explore the communication issues surrounding the design of such a reliable multicast mechanism. Approaches and solutions from the literature are discussed, and four experimental Transport Layer protocols that incorporate reliable multicast are examined.
The Xpress Transfer Protocol (XTP): A tutorial (expanded version)
The Xpress Transfer Protocol (XTP) is a reliable, real-time, lightweight transfer layer protocol. Current transport layer protocols such as DoD's Transmission Control Protocol (TCP) and ISO's Transport Protocol (TP) were not designed for the next generation of high speed, interconnected reliable networks such as fiber distributed data interface (FDDI) and the gigabit/second wide area networks. Unlike all previous transport layer protocols, XTP is being designed to be implemented in hardware as a VLSI chip set. By streamlining the protocol, combining the transport and network layers and utilizing the increased speed and parallelization possible with a VLSI implementation, XTP will be able to provide the end-to-end data transmission rates demanded in high speed networks without compromising reliability and functionality. This paper describes the operation of the XTP protocol and in particular, its error, flow and rate control; inter-networking addressing mechanisms; and multicast support features, as defined in the XTP Protocol Definition Revision 3.4.
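The rate control mentioned above limits how many bytes a sender may emit per time interval. A generic pacing sketch (not XTP's actual RATE/BURST mechanism; the class and its window policy are assumptions for illustration) looks like this:

```python
# Hedged illustration of rate control: cap the sender at `rate_bytes`
# per `interval` seconds. This fixed-window pacer is a generic sketch,
# not XTP's real credit/RATE/BURST machinery.
import time

class RatePacer:
    def __init__(self, rate_bytes: int, interval: float = 1.0):
        self.rate, self.interval = rate_bytes, interval
        self.window_start, self.sent = time.monotonic(), 0

    def delay_for(self, nbytes: int) -> float:
        """Return the seconds the caller should wait before sending nbytes."""
        now = time.monotonic()
        if now - self.window_start >= self.interval:
            # new window: reset the byte budget
            self.window_start, self.sent = now, 0
        if self.sent + nbytes <= self.rate:
            self.sent += nbytes
            return 0.0
        # budget exhausted: defer the send into the next window
        wait = self.window_start + self.interval - now
        self.window_start += self.interval
        self.sent = nbytes
        return max(wait, 0.0)
```

In hardware, as the abstract notes, such bookkeeping can be done in parallel with data movement, which is part of XTP's rationale for a VLSI implementation.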
Experience with abstract notation one
The development of computer science has produced a vast number of machine architectures, programming languages, and compiler technologies. The cross product of these three characteristics defines the spectrum of previous and present data representation methodologies. With regard to computer networks, the uniqueness of these methodologies presents an obstacle when disparate host environments are to be interconnected. Interoperability within a heterogeneous network relies upon the establishment of data representation commonality. The International Standards Organization (ISO) is currently developing the abstract syntax notation one standard (ASN.1) and the basic encoding rules standard (BER) that collectively address this problem. When used within the presentation layer of the open systems interconnection reference model, these two standards provide the data representation commonality required to facilitate interoperability. The details of a compiler that was built to automate the use of ASN.1 and BER are described. From this experience, insights into both standards are given and potential problems relating to this development effort are discussed.
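BER achieves the representation commonality described above by serializing every value as a tag-length-value triple. A minimal sketch for one case, the ASN.1 INTEGER type (universal tag 0x02, two's-complement content octets), assuming short-form lengths only:

```python
# Hedged sketch: BER encoding of an ASN.1 INTEGER as tag-length-value.
# Covers only short-form lengths (< 128 content octets); a real encoder
# must also handle long-form lengths and constructed types.
def ber_encode_integer(n: int) -> bytes:
    """Encode an int as a BER INTEGER: tag 0x02, length, two's complement."""
    # Minimal content-octet count; the +8 reserves room for the sign bit,
    # so e.g. 128 encodes as two octets (00 80), as BER requires.
    length = max(1, (n.bit_length() + 8) // 8)
    content = n.to_bytes(length, "big", signed=True)
    assert len(content) < 128, "short-form length only in this sketch"
    return bytes([0x02, len(content)]) + content

print(ber_encode_integer(5).hex())    # 020105
print(ber_encode_integer(300).hex())  # 0202012c
```

An ASN.1 compiler like the one the paper describes generates such encode/decode routines for every type in a module, instead of writing them by hand.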
THE TRANSMISSION OF PRICE VOLATILITY IN THE BEEF MARKETS
This paper reconsiders the implications of efficient markets for the transmission of price volatility across markets. Tests of volatility transmission are based on conditional variances. Results are reported for key grain and beef markets. Transmission across cash, futures, and options is considered.

Keywords: Cointegration, GARCH, Market Efficiency, Beef Markets, Demand and Price Analysis, Livestock Production/Industries
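The conditional variances underlying such tests typically come from a GARCH(1,1) recursion, h_t = omega + alpha * r_{t-1}^2 + beta * h_{t-1}. A sketch of that recursion follows; the parameter values are illustrative, not estimates from the grain or beef data the paper analyses:

```python
# Hedged sketch of the GARCH(1,1) conditional-variance recursion.
# omega/alpha/beta are illustrative defaults, not fitted parameters.
def garch11_variances(returns, omega=0.1, alpha=0.1, beta=0.8):
    """h_t = omega + alpha * r_{t-1}**2 + beta * h_{t-1},
    started at the unconditional variance omega / (1 - alpha - beta)."""
    h = [omega / (1.0 - alpha - beta)]
    for r in returns[:-1]:
        h.append(omega + alpha * r * r + beta * h[-1])
    return h
```

Volatility transmission between two markets is then tested by asking whether one market's lagged shocks significantly enter the other market's conditional-variance equation.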
A comparison of refined models for flexible subassemblies
Interactions between structure response and control of large flexible space systems have challenged current modeling techniques and have prompted development of new techniques for model improvement. Due to the geometric complexity of envisioned large flexible space structures, finite element models (FEMs) will be used to predict the dynamic characteristics of structural components. It is widely accepted that these models must be experimentally 'validated' before their acceptance as the basis for final design analysis. However, predictions of modal properties (natural frequencies, mode shapes, and damping ratios) are often in error when compared to those obtained from Experimental Modal Analysis (EMA). Recent research efforts have resulted in the development of algorithmic approaches for model improvement, also referred to as system or structure identification.
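The modal properties an FEM predicts come from the generalized eigenvalue problem K*phi = omega^2 * M*phi for stiffness matrix K and mass matrix M. A minimal sketch for a 2-DOF system with diagonal M (a toy example, not the paper's models) solves the characteristic quadratic in omega^2 directly:

```python
# Hedged sketch: natural frequencies of a 2-DOF system from K*phi = w^2*M*phi.
# Assumes a diagonal mass matrix; real FEMs use large sparse eigen-solvers.
import math

def natural_frequencies_2dof(K, M):
    """Return the two natural frequencies (rad/s), lowest first, by solving
    det(K - w^2 * M) = 0 as a quadratic in lambda = w^2."""
    a = M[0][0] * M[1][1]
    b = -(K[0][0] * M[1][1] + K[1][1] * M[0][0])
    c = K[0][0] * K[1][1] - K[0][1] * K[1][0]
    disc = math.sqrt(b * b - 4.0 * a * c)
    lams = sorted([(-b - disc) / (2.0 * a), (-b + disc) / (2.0 * a)])
    return [math.sqrt(lam) for lam in lams]

# Example: two unit masses, two unit springs, fixed-free chain.
K = [[2.0, -1.0], [-1.0, 1.0]]
M = [[1.0, 0.0], [0.0, 1.0]]
print(natural_frequencies_2dof(K, M))  # ~[0.618, 1.618] rad/s
```

Model-improvement methods of the kind the abstract describes adjust entries of K and M so these predicted frequencies and mode shapes better match the EMA measurements.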
