An analysis of coastal temperate old forest residual carbon, structure and understory plant floristics after wildfire
I assessed coastal temperate old-growth forests of southwestern British Columbia to understand their post-wildfire structure, carbon storage and biodiversity values. I used a remotely sensed relativized burn ratio and a composite burn index to compare measures of aboveground carbon, structural complexity and floristic diversity between burned and unburned reference plots years after four large wildfires. The unburned reference plots represented the natural range of variation of old-growth values. Of the burned plots, 21 of 60 retained carbon values, and 10 retained structural values, similar to those of unburned old-growth plots. There was an average of 12% floristic similarity between burned and unburned understory plant communities. For land managers, this method offers a way to compare old-growth values after wildfire in order to prioritize protection, salvage, and restoration options.
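The relativized burn ratio used above is commonly computed from pre- and post-fire Normalized Burn Ratio (NBR) images. A minimal sketch, assuming NIR and SWIR reflectance inputs and the usual 1.001 stabilizing offset in the denominator (the pixel values are toy numbers, not the study's data):

```python
import numpy as np

def nbr(nir, swir2):
    """Normalized Burn Ratio from near-infrared and shortwave-infrared reflectance."""
    nir, swir2 = np.asarray(nir, float), np.asarray(swir2, float)
    return (nir - swir2) / (nir + swir2)

def relativized_burn_ratio(nbr_pre, nbr_post):
    """RBR = dNBR / (pre-fire NBR + 1.001); the offset keeps the denominator nonzero."""
    dnbr = nbr_pre - nbr_post
    return dnbr / (nbr_pre + 1.001)

# Two toy pixels: the first loses most of its vegetation signal after the
# fire (high severity), the second barely changes (low severity).
pre = nbr(nir=[0.45, 0.50], swir2=[0.15, 0.12])
post = nbr(nir=[0.20, 0.48], swir2=[0.35, 0.14])
rbr = relativized_burn_ratio(pre, post)
```

Higher RBR values indicate higher burn severity, which is what makes the index usable for separating burned plots from unburned reference conditions.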
A General Model for the Design of Efficient Sign-Coding Tools for Wavelet-Based Encoders
[EN] Traditionally, it has been assumed that compressing the sign of wavelet coefficients is not worth the effort because they form a zero-mean process. However, several image encoders, such as JPEG 2000, include sign-coding capabilities. In this paper, we analyze the convenience of including sign-coding techniques in wavelet-based image encoders and propose a methodology that allows the design of sign-prediction tools for any kind of wavelet-based encoder. The proposed methodology is based on the use of metaheuristic algorithms to find the sign prediction and context distribution that maximize the resulting sign-compression rate of a particular wavelet encoder. Following our proposal, we have designed and implemented a sign-coding module for the LTW wavelet encoder to evaluate the benefits of the sign-coding tool provided by our methodology. The experimental results show that sign compression can save up to 18.91% of bit-rate when sign-coding capabilities are enabled. We have also observed two general behaviors when coding the sign of wavelet coefficients: (a) the best results are obtained at moderate to high compression rates; and (b) the sign redundancy may be better exploited when working with highly textured images.

This research was supported by the Spanish Ministry of Economy and Competitiveness under Grant RTI2018-098156-B-C54, co-financed by FEDER funds (MINECO/FEDER/UE).

López-Granado, OM.; Martínez-Rach, MO.; Martí-Campoy, A.; Cruz-Chávez, MA.; Pérez Malumbres, M. (2020). A General Model for the Design of Efficient Sign-Coding Tools for Wavelet-Based Encoders. Electronics. 9(11):1-17. https://doi.org/10.3390/electronics9111899
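The core idea behind such sign-coding tools can be illustrated with a toy experiment (synthetic data, not the paper's LTW module): predict each coefficient's sign from a neighborhood context and entropy-code the single "prediction correct" bit, which becomes compressible as soon as the predictor beats 50%.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "wavelet subband" signs: neighbouring coefficients tend to alternate
# sign across an edge, so signs carry exploitable structure (hypothetical data).
n = 5000
signs = np.empty(n, int)
signs[0] = 1
for i in range(1, n):
    # 70% chance of flipping sign relative to the left neighbour
    signs[i] = -signs[i - 1] if rng.random() < 0.7 else signs[i - 1]

# Context = sign of left neighbour; prediction = flip it.
pred = -signs[:-1]
hits = pred == signs[1:]
p = hits.mean()                    # prediction accuracy, ~0.7 here

# Entropy of the "prediction correct" bit vs. 1 bit for a raw sign.
h = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
saving = 1.0 - h                   # fractional bit-rate saving per sign
```

With a 50% predictor the agreement bit costs a full bit and nothing is saved; the paper's metaheuristic search can be read as optimizing the choice of context and prediction rule to push `p` as far from 0.5 as possible.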
Hybrid eager and lazy evaluation for efficient compilation of Haskell
Thesis (Ph.D.), Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2002. Includes bibliographical references (p. 208-220). This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.

The advantage of a non-strict, purely functional language such as Haskell lies in its clean equational semantics. However, lazy implementations of Haskell fall short: they cannot express tail recursion gracefully without annotation. We describe resource-bounded hybrid evaluation, a mixture of strict and lazy evaluation, and its realization in Eager Haskell. From the programmer's perspective, Eager Haskell is simply another implementation of Haskell with the same clean equational semantics. Iteration can be expressed using tail recursion, without the need to resort to program annotations. Under hybrid evaluation, computations are ordinarily executed in program order, just as in a strict functional language. When particular stack, heap, or time bounds are exceeded, suspensions are generated for all outstanding computations. These suspensions are restarted in a demand-driven fashion from the root. The Eager Haskell compiler translates Ac, the compiler's intermediate representation, into efficient C code. We use an equational semantics for Ac to develop simple correctness proofs for program transformations, and to connect actions in the run-time system to steps in the hybrid evaluation strategy. The focus of compilation is efficiency in the common case of straight-line execution; the handling of non-strictness and suspension is left to the run-time system. Several additional contributions have resulted from the implementation of hybrid evaluation. Eager Haskell is the first eager compiler to use a call stack. Our generational garbage collector uses this stack as an additional predictor of object lifetime. Objects above a stack watermark are assumed to be likely to die; we avoid promoting them. Those below are likely to remain untouched and are therefore good candidates for promotion. To avoid eagerly evaluating error checks, they are compiled into special bottom thunks, which are treated specially by the run-time system. The compiler identifies error-handling code using a mixture of strictness and type information. This information is also used to avoid inlining error handlers, and to enable aggressive program transformation in the presence of error handling.

by Jan-Willem Maessen. Ph.D.
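The hybrid strategy described above can be caricatured in a few lines of Python (a sketch of the general idea only, not Eager Haskell's run-time system): computations run eagerly in program order until a fuel budget is exhausted, after which the remainder is suspended as thunks that are forced only on demand.

```python
class Thunk:
    """A suspended computation, forced only on demand and memoized."""
    def __init__(self, fn):
        self.fn, self.done, self.value = fn, False, None
    def force(self):
        if not self.done:
            self.value, self.done = self.fn(), True
        return self.value

def hybrid_map(fn, xs, fuel):
    """Apply fn eagerly until `fuel` steps are used, then suspend the rest."""
    out = []
    for x in xs:
        if fuel > 0:
            out.append(fn(x))                     # eager, program order
            fuel -= 1
        else:
            out.append(Thunk(lambda x=x: fn(x)))  # resource bound hit: suspend
    return out

cells = hybrid_map(lambda x: x * x, range(6), fuel=3)
eager = [c for c in cells if not isinstance(c, Thunk)]
values = [c.force() if isinstance(c, Thunk) else c for c in cells]
```

The real system's demand-driven restart from the root, and the stack-watermark heuristic in the collector, have no analogue in this toy; it only shows the eager-until-bounded, then-suspend shape of evaluation.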
Advances in Multi-User Scheduling and Turbo Equalization for Wireless MIMO Systems
After an introduction, part 2 of this thesis deals with downlink multi-user scheduling for wireless MIMO systems with one transmitting station performing channel-adaptive precoding: different user subsets can be served in each time or frequency resource by separating them in space with different antenna weight vectors. Users with correlated channel matrices should not be served jointly, since correlation impairs their spatial separability. The resulting sum rate for each user subset depends on the precoding weights, which in turn depend on the user subset. This thesis decouples the problem by proposing scheduling metrics based on an estimated rate under ZF precoding such as BD, written with the help of orthogonal projection matrices. This allows estimating rates without computing any antenna weights, by using a repeated projection approximation. The rate estimate can take user rate requirements and fairness criteria into account and can work with either instantaneous or long-term averaged channel knowledge. Search algorithms are presented that efficiently solve user grouping or selection problems jointly for the entire system bandwidth, while being able to track the solution in time and frequency for complexity reduction.
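The repeated-projection idea can be sketched as a greedy user-selection loop: each candidate's achievable ZF rate is estimated from its channel projected onto the orthogonal complement of the already-selected users' channels, so no precoding weights are ever formed. The following is a minimal NumPy sketch under simplifying assumptions (single-antenna users, equal power per stream, invented function names), not the thesis's exact algorithm.

```python
import numpy as np

def greedy_projection_scheduler(H, max_users, power):
    """Greedy user selection for ZF-style SDMA.

    H: (num_users, num_tx_antennas) single-antenna user channels.
    Each candidate's rate is estimated from the squared norm of its channel
    projected onto the orthogonal complement of the selected users' span,
    so no precoding weights are ever computed.
    """
    K, M = H.shape
    P = np.eye(M)                      # projector onto current complement
    selected, sum_rate = [], 0.0
    for _ in range(min(max_users, M)):
        # gains[k] = h_k^H P h_k = ||P h_k||^2 for every candidate at once
        gains = np.einsum('km,mn,kn->k', H.conj(), P, H).real
        gains[selected] = -np.inf      # never re-select a user
        k = int(np.argmax(gains))
        rate = np.log2(1.0 + power * gains[k])
        if rate <= 0 or not np.isfinite(rate):
            break
        selected.append(k)
        # Deflate the projector by the newly selected (projected) channel.
        g = P @ H[k]
        P = P - np.outer(g, g.conj()) / (g.conj() @ g)
        sum_rate += rate
    return selected, sum_rate

rng = np.random.default_rng(1)
H = rng.standard_normal((8, 4)) + 1j * rng.standard_normal((8, 4))
users, rate = greedy_projection_scheduler(H, max_users=4, power=1.0)
```

Because the projector is updated incrementally, each scheduling step costs a few matrix-vector products instead of a full precoder design, which is the point of the projection-based metric.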
Part 3 shows how multiple transmitting stations can benefit from coordinated scheduling or cooperative joint signal processing. An orthogonal-projection-based estimate of the inter-site interference power, again obtained without computing any antenna weights, and a virtual-user concept extend the scheduling approach to cooperative base stations and finally include SDMA half-duplex relays in the scheduling. Signalling overhead is discussed, and a method to estimate the sum rate of a system without coordination is presented.
Part 4 presents optimizations for turbo equalizers, which exploit correlation between user signals as a source of redundancy. Nevertheless, a combination with transmit precoding, which aims at reducing correlation, can be beneficial, because with realistic errors in the channel knowledge at the transmitter no optimal interference suppression is possible. A novel method for the adaptive re-use of a-priori information between iterations is developed to improve convergence by tracking the iterations online with EXIT charts. A method is also proposed to model semi-blind channel estimation updates in an EXIT chart.
Computer simulations based on 4G system parameters illustrate all methods using realistic channel models.
Available in bookstores:
Advances in Multi-User Scheduling and Turbo Equalization for Wireless MIMO Systems / Fuchs-Lautensack, Martin
Ilmenau: ISLE, 2009, 116 pp.
ISBN 978-3-938843-43-
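EXIT-chart tracking, as used in part 4, rests on estimating the mutual information between transmitted bits and the equalizer's LLRs at each iteration. A common time-average estimator for consistent LLRs under the all-zero-codeword assumption is I ≈ 1 − E[log2(1 + e^(−L))]; the sketch below applies it to synthetic Gaussian LLRs (the σ values and sample size are arbitrary illustration choices, not taken from the thesis).

```python
import numpy as np

def llr_mutual_information(llrs):
    """Estimate I(X; L) from LLRs of an all-zero transmission (x = +1),
    assuming the usual Gaussian consistency condition (variance = 2 * mean)."""
    return 1.0 - np.mean(np.log2(1.0 + np.exp(-np.asarray(llrs))))

rng = np.random.default_rng(2)
n = 200_000
# Consistent Gaussian LLRs: mean sigma^2 / 2, standard deviation sigma.
I = []
for sigma in (1.0, 2.0, 4.0):
    llrs = sigma**2 / 2 + sigma * rng.standard_normal(n)
    I.append(llr_mutual_information(llrs))
```

Plotting such estimates for equalizer and decoder against each other yields the EXIT chart; tracking them across iterations is what enables the adaptive re-use of a-priori information described above.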
UNF in Review 1990
An annual report to the community published in 1991 by the University of North Florida and Adam Herbert. The report includes articles highlighting the University's accomplishments from 1990: the 1990 class had the best SATs and GPAs in UNF history; Dr. Faiz Al-Rubaee, associate professor in mathematics, was named the Council for Advancement and Support of Education (CASE) 1990 Florida Professor of the Year; and Rich Matteson, professor of music at the University of North Florida, was inducted into the Jazz Hall of Fame in 1990. A Voices page includes a message from the president, a debate section and Candid Campus, profiles of Dr. Leon M. Lessinger, Dr. Joseph Perry and Anthony Williams, and Alumni Notes. Also includes News from Around Campus, Sports and Business. The donors to the University of North Florida Foundation are listed.
1999 Annual reports of the town officers of Henniker, New Hampshire.
This is an annual report containing vital statistics for a town/city in the state of New Hampshire.
Spatial characterization and analysis of forests in the Mount Bachelor volcanic chain, central Oregon
Forest spatial pattern is a primary interest of landscape ecology because of the relationships between the spatial configuration of biotic components and ecological processes. Spatial pattern must be measured in meaningful ways so that relationships between forests and their environment can be analyzed. Aerial and satellite imagery provides ecologists a variety of scale choices at which the spatial information of forests can be presented, at levels ranging from individual trees to landscapes. The high lava plain of central Oregon is characterized by young lava flows of moderate relief interrupted by scattered cinder cones and lava buttes. The regosolic soils developed on pumice support open coniferous forests of nationwide significance. The relationship between the forests and the harsh rocky land has not been analyzed, yet large portions of the forest have been logged at various intensities over the last 40-50 years. A better understanding of the relationship between forests and environment is needed for management of healthy forest ecosystems. The intent of this study is to use remotely sensed data to measure the spatial variability of forest patterns across the lava landscape of the Mount Bachelor volcanic chain, and to analyze relationships between forest structural attributes and environmental variables. First, I used aerial photographs to characterize tree point pattern and measure canopy crown closure and density. A step-wise digital approach based on the spatial, spectral, and topographic characteristics of the photographic data was developed to measure forest spatial patterns. The method provides a fast, accurate, yet low-cost way to characterize tree point pattern and measure canopy crown closure and density. Second, I used Landsat TM imagery to estimate leaf area index (LAI). A new approach using multiple regression analysis was developed to overcome the saturation problems of commonly used vegetation indices at high LAI ranges and to improve the performance of Landsat TM data in estimating LAI. Finally, I conducted an analysis using the spatial data developed in this study, supplemented with data obtained from the Deschutes National Forest. The study documents methods of integrating multiple GIS data layers for spatial analysis and of parameterizing relationships between forests and environment.
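The saturation problem mentioned above, and the gain from moving to multiple regression, can be illustrated on synthetic data (the band-response curves below are hypothetical, not fitted to Landsat TM):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300
lai = rng.uniform(0.5, 8.0, n)          # "true" leaf area index (synthetic)

# Hypothetical band responses: an NDVI-like index flattens out at high LAI,
# while a SWIR-like predictor keeps responding (purely illustrative shapes).
ndvi = 0.9 * (1 - np.exp(-0.6 * lai)) + 0.02 * rng.standard_normal(n)
swir = np.exp(-0.25 * lai) + 0.02 * rng.standard_normal(n)

def r2(y, yhat):
    """Coefficient of determination."""
    return 1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)

# Single-index regression vs. multiple regression on both predictors.
X1 = np.column_stack([np.ones(n), ndvi])
X2 = np.column_stack([np.ones(n), ndvi, swir])
b1, *_ = np.linalg.lstsq(X1, lai, rcond=None)
b2, *_ = np.linalg.lstsq(X2, lai, rcond=None)
r2_single, r2_multi = r2(lai, X1 @ b1), r2(lai, X2 @ b2)
```

Because the single index carries almost no information once it saturates, adding a second predictor that still varies at high LAI improves the fit, which is the basic logic behind the multiple-regression approach in the abstract.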
Heuristic solution techniques for a spatial harvest scheduling problem involving wildlife habitat and timber income
Three heuristic techniques, simulated annealing (SA), tabu search (TS), and tabu search with strategic oscillation (TSSO), were used to schedule silvicultural activities designed to accelerate development of older forest structure at both stand and landscape scales over a 2450-acre forest located in northwestern Oregon. Goals for the forest over a 100-year planning horizon included reaching at least 500 acres of older forest structure, with at least one contiguous 200-acre (or larger) block, as soon as possible. The configuration and location, but not the amounts, of the older forest structure acres and the contiguous block were then free to move about the forest through time while best meeting the goal of producing a high, steady revenue flow over the entire planning horizon, subject to restrictions on maximum clearcut patch size. The heuristic techniques were able to provide feasible tactical schedules fulfilling the strategic goals over the entire horizon in ways which traditional forest planning tools cannot. Of the three techniques examined, TSSO produced schedules with the best, most consistent objective function values. SA yielded a wider range of values which were always slightly worse but required only a fraction of the computing time. Straightforward TS produced relatively poor objective function values, most likely because of its inability to search the infeasible regions of the diverse solution space. Estimation of the globally optimal objective function value using Weibull distributions suggested that all TSSO solutions were within 1.8% of the optimum, the best being within 0.03%, while all SA solutions were within 7.6%, the best being within 1.7%. However, the 95% confidence intervals of the Weibull location parameter estimates for the SA and TSSO distributions did not overlap, despite the fact that both distributions of results failed to be rejected as fitting a Weibull distribution. This disparity again suggests that statistical inference of global optima by itself may be an inadequate means of assessing how "good" a heuristic is.
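A toy version of the SA component can convey the setup (hypothetical stand revenues, a one-dimensional adjacency penalty standing in for the clearcut patch-size restriction, and arbitrary weights; not the study's actual formulation):

```python
import math, random

random.seed(4)

# Toy forest: 12 stands in a row; neighbours clearcut in the same period
# count against a maximum-opening constraint (hypothetical stand data).
n_stands, n_periods = 12, 4
revenue = [[random.uniform(5, 15) * (p + 1) for p in range(n_periods)]
           for _ in range(n_stands)]

def objective(sched):
    """Total revenue, an even-flow bonus, and an adjacency penalty."""
    flow = [0.0] * n_periods
    for s, p in enumerate(sched):
        flow[p] += revenue[s][p]
    evenness = min(flow) - max(flow)   # 0 when flow is perfectly even
    adjacency = sum(1 for s in range(n_stands - 1) if sched[s] == sched[s + 1])
    return sum(flow) + 2.0 * evenness - 10.0 * adjacency

def simulated_annealing(iters=20_000, t0=10.0, alpha=0.9995):
    sched = [random.randrange(n_periods) for _ in range(n_stands)]
    best, best_val, t = sched[:], objective(sched), t0
    for _ in range(iters):
        cand = sched[:]
        cand[random.randrange(n_stands)] = random.randrange(n_periods)
        delta = objective(cand) - objective(sched)
        # Accept improvements always; accept worsenings with Boltzmann probability.
        if delta >= 0 or random.random() < math.exp(delta / t):
            sched = cand
            if objective(sched) > best_val:
                best, best_val = sched[:], objective(sched)
        t *= alpha
    return best, best_val

schedule, value = simulated_annealing()
```

The cooling schedule's willingness to accept worsening moves early on is what lets SA escape local optima; tabu search replaces this with memory structures, and strategic oscillation deliberately crosses in and out of the infeasible region, which the abstract identifies as TSSO's advantage.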
Wildland fire management. Volume 1: Prevention methods and analysis
A systems engineering approach is reported for the problem of reducing the number and severity of California's wildland fires. Prevention methodologies are reviewed and cost-benefit models are developed for making preignition decisions.
Exploring a Modelling Method with Semantic Link Network and Resource Space Model
To model the complex reality, it is necessary to develop a powerful semantic model. A rational approach is to integrate a relational view and a multi-dimensional view of reality. The Semantic Link Network (SLN) is a semantic model based on a relational view, and the Resource Space Model (RSM) is a multi-dimensional model for managing, sharing and specifying versatile resources with a universal resource observation. The motivation of this research consists of four aspects: (1) verify the roles of the Semantic Link Network and the Resource Space Model in effectively managing various types of resources; (2) demonstrate the advantages of the Resource Space Model and the Semantic Link Network; (3) uncover the rules through applications; and (4) generalize a methodology for modelling complex reality and managing various resources. The main contribution of this work consists of the following aspects:
1. A new text summarization method is proposed by segmenting a document into clauses based on semantic discourse relations and by ranking and extracting the informative clauses according to their relations and roles. The Resource Space Model benefits from using the semantic link network, ranking techniques and language characteristics. Compared with other summarization approaches, the proposed approach based on semantic relations achieves a higher recall score. Three implications are obtained from this research.
2. An SLN-based model for recommending research collaboration is proposed by extracting a semantic link network of different types of semantic nodes and different types of semantic links from scientific publications. Experiments on three data sets of scientific publications show that the model achieves good performance in predicting future collaborators. This research further unveils that different semantic links play different roles in representing texts.
3. A multi-dimensional method for managing software engineering processes is developed. Software engineering processes are mapped into multiple dimensions to support the analysis, development and maintenance of software systems. The method can be used to uniformly classify and manage software methods and models through multiple dimensions so that software systems can be developed with appropriate methods. Interfaces for visualizing the Resource Space Model are developed to support the proposed method, keeping consistency among the interface, the structure of the model and faceted navigation.
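The clause-ranking step in contribution 1 can be caricatured with a tiny link network over clauses, scored by total link weight (word-overlap similarity stands in for the paper's semantic discourse relations; the clauses are invented):

```python
from collections import Counter

clauses = [
    "the resource space model organizes resources in multiple dimensions",
    "the semantic link network represents relations between resources",
    "relations between resources support ranking informative clauses",
    "the weather was pleasant during the conference",
]

def similarity(a, b):
    """Word-overlap similarity: a crude stand-in for a semantic link weight."""
    wa, wb = Counter(a.split()), Counter(b.split())
    return sum((wa & wb).values()) / (len(a.split()) + len(b.split()))

# Build the link network and rank each clause by its total link weight;
# the most connected clause is extracted as the one-clause summary.
scores = [sum(similarity(c, d) for d in clauses if d is not c) for c in clauses]
summary = max(zip(scores, clauses))[1]
```

The off-topic clause receives almost no link weight and is never extracted, which is the intuition behind ranking clauses by their relations rather than treating the document as a bag of independent sentences.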