Electronic Theses and Dissertations - UTSA Access Only

Permanent URI for this collection: https://hdl.handle.net/20.500.12588/2227

This collection contains electronic UTSA theses and dissertations (ETDs), primarily from 2005 to present. The collection is not comprehensive; search the UTSA Library Catalog for a complete list of UTSA theses and dissertations.

These ETDs are available only to currently enrolled UTSA students, faculty, and staff. To download a UTSA-access-only ETD, navigate to “Log In” in the top right-hand corner of this screen, then select “Log in with my UTSA ID.”

Authors of these ETDs have retained their copyright while granting UTSA Libraries the non-exclusive right to reproduce and distribute their works.

Former students are invited to broaden access to their thesis or dissertation by making it available in the Open Access collection. To initiate this process, or if you have any questions about the ETD collection, please contact rrpress@utsa.edu.

Recent Submissions

Now showing 1 - 20 of 3370
  • Item
    Magnetic Reconnection on the Day Side of the Earth: Electron Acceleration and Electric Fields in the Electron Diffusion Region
    (2023) Pritchard, Kristina Renee
    This dissertation focuses on magnetic reconnection on the day side of Earth's magnetosphere at the electron scale. Observations by the National Aeronautics and Space Administration's Magnetospheric Multiscale Mission are used to better define and understand various aspects of the electron diffusion region. The first study focuses on an event at a nonprimary electron diffusion region where the spacecraft pass through the X-line sequentially, in a fashion optimal for detailed multi-spacecraft evaluation. By utilizing velocity distribution functions and the corresponding energy spectra coupled with readings of the normal electric field (EN), the acceleration of magnetosheath electrons crossing the X-line into the magnetopause is quantified for the first time. The second study focuses on the reconnection electric field and normalized reconnection rates for fourteen reconnection events on the day side of Earth's magnetosphere. Because the reconnection electric field component is much smaller than the other components, special care is taken to eliminate contamination by the larger components. This process is used for all 47 spacecraft that pass through electron diffusion regions in these events. We find that normalized reconnection rates vary greatly within very small temporal and spatial scales. No dependence of the normalized reconnection rate on local or solar-wind parameters is found, but there is a positive correlation between the unnormalized reconnection electric field and the solar-wind dynamic pressure in the magnetosheath. When summed over all spacecraft in these events, we find an average normalized reconnection rate of 0.15, which agrees with theoretical predictions. In addition to this manuscript, supplementary material in the form of an Excel spreadsheet is included that provides additional data pertaining to Chapter 3: a range of physical properties and calculations acquired at each spacecraft for the fourteen dayside events studied that were not included in the main text.
  • Item
    Sensitivity Analysis in Structural Dynamics Using Hypercomplex Automatic Differentiation and Spectral Finite Elements
    (2023) Navarro, Juan David
    The dynamic behavior of structural systems is significantly influenced by variations in structural parameters and service conditions, making it crucial to understand their impact to ensure structural integrity and safety. Sensitivity analysis provides a means to quantify the influence of parameter variations on the dynamic response of structures. This dissertation presents a novel methodology, the Hypercomplex Spectral Finite Element Method (HYPAD-SFEM), which combines the Hypercomplex Automatic Differentiation method (HYPAD) with the Spectral Finite Element Method (SFEM). The objective is to compute highly accurate arbitrary-order sensitivities in structural dynamics, enabling a deeper understanding of how variations in structural parameters and service conditions impact the dynamic response of structures. The dissertation is structured as a compendium of four publications focused on three specific aims. First, the influence of parameter variations on the free vibration of structures is investigated, where the natural modes and frequencies of vibration are analyzed using generalized eigenvalue problems. Second, the influence of parameter variations on the forced vibration of structures is examined by obtaining sensitivities of the frequency response function (FRF). Last, the influence of parameter variations on the transient dynamic response of structures is explored through explicit dynamic simulations. Subsequently, the methodology is used to assess the detection capabilities of waveguide Structural Health Monitoring (SHM), providing an efficient and accurate framework for estimating the probability of detection (POD) of SHM systems. Overall, this dissertation extends the application of HYPAD to complex-valued problems, eigenvalue problems, frequency-domain solutions, explicit time integration schemes, and spectral elements.
  • Item
    Memory Consolidation: How Reliable Are Our Memories?
    (2023) Lopez, Matthew Roberto
    In the first part of this thesis, I evaluate the contribution of sleep to memory representations of an object/place memory in young and old animals. I demonstrate that different subpopulations of cells code distinct aspects of the mnemonic experience and that an acute session of sleep deprivation improves cognitive performance in old mice while producing impairments in young ones. Analysis of sleep patterns demonstrates that improved memory in old mice correlates with consolidated SWS, showing that acute sleep deprivation has different effects in young and old mice. In the second part of this thesis, I evaluate the stability of cortical memory engrams, the changes associated with memory retrieval, and the reorganization of cortical networks over time. The results demonstrate that constitutively active neurons (neurons active across retrieval sessions) carry the emotional valence of learned cues, allowing discrimination of safe and fearful memory traces, whereas temporarily active neurons generalize representations at the neural level.
  • Item
    User Privacy and Cyber Threats: Discourse Analysis of Cyber Threat Incidents
    (2023) Bhatt, Paras
    This dissertation focuses on three major studies aimed at addressing user privacy and cyber threats through discourse analysis of different communities, namely social media, cybersecurity practitioners, and the dark web community, using natural language processing techniques validated through annotations and theoretical frames based on the Theory of Data Breach Harms and Routine Activity Theory. The work focuses on major contemporary cyber threats such as data breaches and ransomware attacks. Using theoretical and data-driven approaches, we conduct a thorough analysis of literature frames that explain different views of cyber threat incidents and use natural language processing to evaluate discourse in different communities, enabling the generation of actionable cyber threat insights. Social network communities transmit public concerns and raise awareness about societal issues such as user privacy and cyber threats. Recently, the need to process public discourse has become a critical research topic. However, the growing volume of unstructured data makes it difficult to process all issues under a single umbrella, causing central topics of interest, such as privacy and cyber threats, to be overlooked. To address the challenge of rising data volumes and to evaluate the discourse on user privacy and cyber threats, particularly among users seeking greater cybersecurity protection, we conduct focused empirical analyses of social media, cybersecurity practitioner, and dark web community discourse about user privacy and cyber threat incidents.
  • Item
    A sublexical unit based hash model approach for spam detection
    (2009) Zhang, Like
    This research introduces an original anomaly detection approach based on a sublexical unit hash model for application-level content. The approach is an advance over previous arbitrarily defined payload keyword and 1-gram frequency analysis approaches. Based on the split fovea theory of human recognition, it uses a special hash function to identify groups of neighboring words. The hash frequency distribution is calculated to build the profile for a specific content type. Examples of applying the algorithm to detect spam and phishing emails are illustrated in this dissertation. A brief review of network intrusion and anomaly detection is presented first, followed by a discussion of recent research initiatives on application-level anomaly detection. Previous research results for payload keyword and byte frequency based anomaly detection are also presented. The drawback of N-gram analysis, which has been applied in most related research efforts, is discussed at the end of Chapter 2, along with the importance of text content analysis to application-level anomaly detection. After a background introduction to the split fovea theory in psychological research, the proposed sublexical unit hash frequency distribution based method is presented. How human recognition theory serves as the fundamental element of the proposed hashing algorithm is examined, followed by a demonstration of how the hashing algorithm is applied to anomaly detection. Spam email is used as the major example in this discussion. The reasons spam and phishing emails are used in our experiments include the availability of detailed experimental data and the possibility of conducting an in-depth analysis of the test data. An interesting comparison between the proposed algorithm and several popular commercial spam email filters used by Google and Yahoo is also presented; the outcome shows the benefits of the proposed approach. The last chapter reviews the research, explains how the earlier payload keyword approach evolved into the hash model solution, and discusses the possibility of extending hash model based anomaly detection to other areas, including Unicode applications.
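The profile-building idea in the abstract above can be illustrated with a minimal sketch: hash each pair of neighboring words into buckets, build a normalized hash frequency distribution from known-good messages, and score new messages by distance from that profile. The dissertation's actual hash function (motivated by split-fovea recognition), window size, and distance measure are not reproduced here; CRC32 bucketing and an L1 distance are assumed stand-ins.

```python
import zlib
from collections import Counter

def sublexical_hashes(text, num_buckets=256):
    # Hash each pair of neighboring words into a fixed number of buckets.
    # CRC32 is a stand-in; the dissertation's actual hash function differs.
    words = text.lower().split()
    return [zlib.crc32((words[i] + " " + words[i + 1]).encode()) % num_buckets
            for i in range(len(words) - 1)]

def build_profile(messages, num_buckets=256):
    # Normalized hash frequency distribution: the profile of one content type.
    counts = Counter()
    for message in messages:
        counts.update(sublexical_hashes(message, num_buckets))
    total = sum(counts.values()) or 1
    return {bucket: c / total for bucket, c in counts.items()}

def profile_distance(profile, message, num_buckets=256):
    # L1 distance between the trained profile and the message's own
    # distribution; larger values suggest off-profile (anomalous) content.
    hashes = sublexical_hashes(message, num_buckets)
    if not hashes:
        return 0.0
    freq = Counter(hashes)
    n = len(hashes)
    buckets = set(profile) | set(freq)
    return sum(abs(profile.get(b, 0.0) - freq.get(b, 0) / n) for b in buckets)
```

In use, the profile would be trained on legitimate mail; a message whose neighboring-word pairs match the profile scores low, while text drawn from a different distribution scores near the maximum L1 distance of 2.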
  • Item
    The influence of electronic word-of-mouth (eWOM) skepticism on perceptions toward message credibility and beneficiary organization
    (2016) Zhang, Xiao Jerry
    Electronic word-of-mouth (eWOM) has been cited as a significant factor influencing Internet users' perceptions in various situations, sectors, and industries (Lee et al. 2009; Chatterjee 200; Dellarocas et al. 2007; Cox et al. 2008). However, as more evidence of the pervasive use of fake eWOM has been exposed (Forrest and Cao 2010; Malbon 2013), Internet users' confidence in the truthfulness and genuineness of eWOM may have been severely undermined. Unlike most existing online trust and online information credibility literature, this research assumes that Internet users may have already developed a certain level of skepticism toward all eWOM messages (eWOM skepticism). Our research focuses on how eWOM skepticism is influenced by personal and environmental factors, and how eWOM skepticism influences Internet users' message credibility assessments and their attitudes toward the organization that may benefit from fake eWOM propaganda. To achieve this goal, we first created new measurement items for eWOM skepticism and validated them. Then, using a controlled experiment, we collected data that were analyzed using Multivariate Analysis of Covariance (MANCOVA) and Partial Least Squares (PLS). The results revealed that dispositional trust, structural assurance, and negative experience with eWOM significantly influence eWOM skepticism, and that eWOM skepticism is likely to influence Internet users' judgments and perceptions. Based on our model, we also found that the attributions Internet users make about eWOM messages are strong predictors of their attitudes toward those messages and the potential beneficiary organization. This study emphasizes the importance of incorporating eWOM skepticism when investigating eWOM trust scenarios, and supports the argument that trust and distrust coexist. Several theoretical and practical contributions are also discussed.
  • Item
    Management of shared resources in multi-threading / multi-core systems
    (2014) Zhang, Yilin
    Building on traditional superscalar processors, Simultaneous Multi-Threading (SMT) offers an improved mechanism to enhance overall performance by exploiting Thread-Level Parallelism (TLP) to overcome the limits of Instruction-Level Parallelism (ILP), while a multi-core system with multiple independent processors is capable of exploiting job-level parallelism by allowing multiple jobs to be processed concurrently. The most common characteristic of parallel systems is the sharing of key datapath components among multiple independent threads/jobs in order to better utilize resources. In an SMT system, because of the varying characteristics of each thread, occupation of the shared resources can be very unbalanced. Our research aims to solve this problem and to allocate resources efficiently among threads. Our investigation shows that among the resources in an SMT system, the physical register file, the Issue Queue (IQ), and the write buffer are the most critical resources shared among threads. Several approaches are proposed in this dissertation: Register File Allocation, Instruction Recalling, Speculative Trace Control, Autonomous IQ Usage Control, Write Buffer Capping, and Integrated Shared Resources Control. To better utilize the physical register file, we limit the maximal number of physical registers that a thread is allowed to occupy at any time, so as to eliminate overwhelming occupation by a single thread. Several techniques are proposed to improve the utilization of the IQ: (1) to reduce the IQ occupation of an inactive thread, we introduce Instruction Recalling to remove long-latency instructions; (2) to reduce the waste of resources caused by wrong-way traces due to branch misprediction, we propose an algorithm to control the number of speculative instructions from a thread that are dispatched and executed in the pipeline, the so-called Speculative Trace Control technique; and (3) to remove the environment dependency of a technique, we introduce Autonomous Control to adjust the IQ distribution based on real-time performance output. The write buffer is another shared resource that is easily occupied unfairly. Write Buffer Capping prevents any thread from overwhelmingly occupying the write buffer by setting a cap on the maximal number of write buffer entries that a thread is allowed to take. Integrated Shared Resource Management takes the above factors into consideration and manages the usage of the most critical shared resources (physical register file, IQ, and write buffer) simultaneously for each thread, providing significant enhancement with a relatively small hardware investment. In a multi-core system, memory and the interconnection network are shared among processors, and their performance is key to the overall throughput of the system. In the last chapter we extend our analysis to the impact that different interconnection networks have on the system's overall performance. We show that the tradeoff between latency and concurrent access capacity may become a critical factor in choosing the correct network size for applications with different memory traffic demands.
  • Item
    Privacy preservation in social graphs
    (2012) Zhang, Lijie
    Hundreds of millions of people use social network sites daily for entertainment, socialization, and business purposes. Social network sites have accumulated huge amounts of personal information, which can be modeled by social graphs, where vertices represent persons and edges represent relationships. Social graphs have attracted tremendous interest from scholars and application developers in many areas. However, varieties of sensitive personal information raise privacy concerns for social network users and owners, and prevent publishing the graphs for widespread use. Diverse privacy attacks cause privacy disclosure in social graphs. We categorize these attacks into two categories: vertex re-identification attacks and information re-association attacks. For vertex re-identification attacks, much early research proposes anonymization methods to protect the identity of vertices so that private information on vertices is preserved. Nevertheless, sensitive relationships represented by edges may still be disclosed. Our work focuses on designing anonymization methods to protect the sensitive edges. We address three issues in the method design: the privacy measure, the utility loss measure, and performance. Our contributions include using a strong equivalence relation to define the privacy measure, choosing the number of edges changed as the utility loss measure on theoretical grounds, and developing efficient methods based on a condensed graph for large-scale graphs. For information re-association attacks, recent studies have designed attack models based on various techniques, such as statistical models, data mining techniques, or security attacks. We design a new information re-association attack that combines web search and information extraction techniques. Our contributions include proposing a measurement to evaluate the effectiveness of the attack and empirically analyzing the privacy disclosure under this attack.
  • Item
    Essays on state pension plans and trading in bankrupt stocks
    (2014) Zhang, Hongxian
    This dissertation consists of two essays on suboptimal behavior in financial markets. Essay I examines stocks of bankrupt firms after the court confirms that shareholders will receive nothing. While trading volume is negligible for most worthless stocks, some have sizable trading volume, indicating investor ignorance of their zero intrinsic value. Prices respond irrationally to news in several instances, and they are higher for more liquid worthless stocks, which are more likely to attract uninformed investors. Our analysis includes the first empirical examination of short-selling in bankrupt firms. Short-covering cannot account for the anomalous prices and trading volume. Short-sellers are active in these stocks and play a useful role in pushing prices down toward intrinsic value. Essay II examines the effects of state corruption as well as political and governance factors on U.S. public pension funds. We find that pension funds in states with more corruption have lower performance; a one-standard-deviation increase in corruption is associated with a decrease in annual returns of between 17 and 21 basis points, and this relation is robust to state-level and pension-level fixed effects. Pensions located in more corrupt jurisdictions also invest a larger fraction of their assets in equities. We find that having a new treasurer decreases the negative effects of corruption, suggesting that frequent changes in administration are beneficial in corrupt jurisdictions. Governance-related variables and political affiliation variables are by themselves not significantly related to pension returns, although these variables are associated with differences in asset allocation.
  • Item
    Bayesian power prior analysis and its application to operational risk and Rasch model
    (2010) Zhang, Honglian
    When sample size is small, informative priors can be valuable in increasing the precision of estimates. Pooling historical data and current data with equal weights under the assumption that both of them are from the same population may be misleading when heterogeneity exists between historical data and current data. This is particularly true when the sample size of historical data is much larger than that of the current data. One way of constructing an informative prior in the presence of the historical data is the power prior, which is realized by raising the likelihood of the historical data to a fractional power. In this dissertation, we extend the power prior by considering the existence of nuisance parameters. When historical information is used as priors, we assume that the parameters of interest have not changed, while the nuisance parameter may change. The properties of power prior methods with nuisance parameters and its posterior distributions are examined for normal populations. The power prior approaches, with or without nuisance parameters, are compared empirically in terms of the mean squared error (MSE) of the estimated parameter of interest as well as the behavior of the power parameter. To illustrate the implementation of the power prior with nuisance parameter approach, we apply it to lognormal models for operational risk data and the Rasch model for item response theory (IRT). In the application to the Rasch model, we extend the power prior with nuisance parameter approach further by incorporating it with the hierarchical Bayes model.
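The power prior construction the abstract describes has a standard general form; the notation below is the generic formulation commonly attributed to Ibrahim and Chen, not notation taken from the dissertation itself:

```latex
% Power prior: the likelihood of historical data D_0 raised to a
% fractional power a_0, weighting an initial prior \pi_0(\theta)
\pi(\theta \mid D_0, a_0) \;\propto\; L(\theta \mid D_0)^{a_0}\,\pi_0(\theta),
\qquad 0 \le a_0 \le 1
```

Here \(a_0 = 0\) discards the historical data entirely and \(a_0 = 1\) pools the two data sets with equal weight, so the power parameter smoothly controls how much heterogeneity between historical and current data is tolerated.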
  • Item
    Long-Range GABAergic Projections in the Mouse Auditory Cortex
    (2020) Zurita Apellaniz, Hector
    The auditory cortex is a brain area where many complex auditory signals are processed to enable an organism to understand, learn, and react to the acoustic environment. How neurons connect within the cytoarchitecture of the auditory cortex determines how these processes take place. Understanding the diversity and interconnectivity of neurons in the auditory cortex and other brain areas advances the understanding of how the auditory system works. Two types of neurons in the auditory cortex work together to make the system function properly and efficiently: excitatory glutamatergic and inhibitory GABAergic neurons. Classically, glutamatergic neurons in the cortex are considered long-range projecting neurons that transmit signals to other brain areas through excitation, whereas GABAergic neurons project only locally and modulate the local neuronal network through inhibition. However, recent evidence shows that cortical GABAergic neurons can project not only locally but also over long ranges, inhibiting other brain areas. Recent technological advances and classical techniques in neuroscience, including transgenic mice, viral and non-viral neuronal tracers, in vitro slice electrophysiology, optogenetics, pharmacology, confocal microscopy, neuronal tracing, and immunohistochemistry, allowed the discovery and study of these overlooked neurons. Cortical long-range GABAergic neurons add to the complexity of how cortical circuits process and transmit neuronal signals throughout the brain. In this dissertation, I describe and explore the characteristics, connectivity, and function of these novel long-range GABAergic neurons in the mouse auditory cortex.
  • Item
    Perceptions of rural Latina/o high school students: Addressing issues concerning higher education access
    (2013) Zimmerman, Renee Trevino
    This qualitative study compares and analyzes the social network experiences of six students. Three sites were included in this research, and two participants consented to participate per site. All students identify as Latina/o, come from low-socioeconomic backgrounds, and are potential first-generation college students. The findings of this study reveal the varying degrees of social capital each student received, which bears on their broader educational trajectories.
  • Item
    Numerical simulation of the transient thermal response of porous rock surrounding a well extracting geothermal energy
    (2011) Zigtema, John
    Numerical simulations of the thermal response of porous rock surrounding a well extracting geothermal energy were performed. The rock surrounding the well was chosen as Salem limestone and was homogeneous throughout the surrounding control volume. The porosity of the rock ranged from 0.05 to 0.30, and the pores were assumed to be spherical, evenly distributed throughout the control volume, and closed, i.e., fluid transport effects were negligible. The depth of the water table in the control volume ranged from 100 m to 4000 m below the surface; pores above the water table were saturated only with air, while pores at or below the water table were saturated with liquid water. The temperature profile of the well was held constant throughout the simulation, with a temperature difference from the initial rock temperature profile ranging from 5 K to 10 K. In the simulations the well had a diameter of 0.3 m and extended 3000 m below the surface; the control volume extended from the surface to 3750 m below the surface and from the well wall outward 25 m radially; and the simulations covered a time span of 10 years. The simulation model was developed from the radial coordinate form of the heat equation and accounted for heat transfer radially and vertically, and for the change in thermophysical properties caused by rock porosity and the fluid contained within the pores. The simulations showed two distinct regions of change in the temperature profile of the rock: the region at the depth of the well and the region below it. In the region below the well, there was no change in the temperature profile over time. These results indicate that heat transfer in the control volume took place almost exclusively in the radial direction, with notable temperature change extending only a relatively small distance radially from the well.
    In the well region, two distinct sub-regions were found to exist when the water table was located within the well region: one above the water table, where the pores were saturated with air, and one below, where the pores were saturated with water. Both regions showed a similar response: a transient temperature change starting at the well wall and traveling radially outward through the control volume over time, with the amplitude of the transient decreasing over time. The temperature change in the air-filled region was higher than in the water-filled region before the transient, while the temperature change in the water-filled region was higher after the transient. Changes in the water table depth were found to affect only the location of the air-filled and water-filled regions, with the temperature response of these regions the same regardless of the water table depth. Increased porosity was found to decrease the change in the temperature profile, with the effects more noticeable in the water-filled region. The well temperature difference caused proportional changes in the temperature profile, with an increased well temperature difference resulting in an increased temperature profile change. The heat transfer rate from the surrounding rock to the well was studied over time: for a well temperature difference of 10 K, initial heat transfer rates of 120 to 190 kW decayed exponentially over time to steady-state rates of 40 to 70 kW. The heat transfer rate was found to be proportional to the well temperature difference, which otherwise had no effect on the heat transfer rate over time. Increased porosity decreased the heat transfer rate while increasing the time to reach steady state, while increasing water table depth decreased both the heat transfer rate and the time to reach steady state.
    The time to reach steady state ranged from 5.7 years to 7.3 years, and the effect of porosity on the time to steady state was found to be greater at decreased water table depths, with water table depth having a larger effect than porosity. From the thermal response of the system, it was determined that the temperature change in the rock extended a relatively small distance from the well, allowing wells to be located relatively close together. By operating such wells simultaneously, geothermal energy extraction would be additive, allowing greater overall rates of energy extraction. Conversely, if the wells were operated sequentially, geothermal energy could be extracted for a greater period of time.
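In generic notation, the radial-coordinate form of the transient heat equation with radial and vertical conduction that such a model builds on can be written as follows; the dissertation's exact porosity-dependent property model is not reproduced here:

```latex
% Axisymmetric transient conduction around the well; \rho, c_p, k are
% effective porous-medium properties depending on porosity and pore fluid
\rho c_p \frac{\partial T}{\partial t}
  = \frac{1}{r}\frac{\partial}{\partial r}\!\left(k\,r\,\frac{\partial T}{\partial r}\right)
  + \frac{\partial}{\partial z}\!\left(k\,\frac{\partial T}{\partial z}\right)
```

The effective density, specific heat, and conductivity vary with porosity and with whether the pores hold air or liquid water, which is what produces the distinct air-filled and water-filled responses described above.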
  • Item
    Improvement of Machine-to-Machine Reproducibility of Stereolithography (SLA) Printers Using Gaussian Processes and Bayesian Optimization
    (2022) Zilevicius, Danielius
    SLA has gained popularity due to its high resolution and high degree of design freedom; however, variability in product quality remains an issue. The lack of reproducibility from printer to printer is a barrier to the qualification and certification of machines in the additive manufacturing (AM) industry. Uncertainty Quantification (UQ) has gained increasing attention within the AM industry as a way to combat reproducibility issues. The most common method for applying UQ in AM is physical experiments; however, the number of experiments required to relate process parameters to process outputs makes this method expensive and time consuming. This study focuses on transferring optimal printing parameters to different printers to optimize the dimensional accuracy of printed parts and reduce the number of experiments required to certify an AM machine. The printing parameters used in the study were X-, Y-, and Z-orientation, layer thickness, support density, support touchpoint size, cure temperature, and cure time. A Gaussian Process was used to fit the data, and Bayesian Optimization was used to determine the sets of printing parameters to be used during the design of experiments (DOE). Once the optimal parameters were found for printer 1, the same parameters were used on printer 2, and the Gaussian Process was used to determine additional printing parameters. While printer 2 performed better than printer 1, the algorithm showed improved dimensional accuracy from the initial prints to the optimal prints on both printers. The approach presented in this study is effective for reducing the number of experiments required to certify SLA machines and can be adopted for other AM processes.
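The Gaussian-Process-plus-Bayesian-Optimization loop described above can be sketched minimally: fit a GP to the experiments run so far, use an acquisition function to pick the next experiment, and repeat. Everything below is an illustrative stand-in, not the study's setup: a pure-numpy squared-exponential GP, expected improvement as the acquisition, and a hypothetical one-parameter "dimensional error" objective.

```python
import numpy as np
from math import erf

def rbf_kernel(a, b, length_scale=0.2):
    # Squared-exponential kernel between two 1-D point sets.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_query, jitter=1e-6):
    # Gaussian Process regression: posterior mean and std at query points.
    K = rbf_kernel(x_train, x_train) + jitter * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_query)
    mu = Ks.T @ np.linalg.solve(K, y_train)
    v = np.linalg.solve(K, Ks)
    var = np.clip(1.0 - np.sum(Ks * v, axis=0), 1e-12, None)  # k(x,x)=1 for RBF
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, best):
    # EI acquisition for minimization.
    z = (best - mu) / sigma
    cdf = 0.5 * (1.0 + np.array([erf(zi / np.sqrt(2.0)) for zi in z]))
    pdf = np.exp(-0.5 * z ** 2) / np.sqrt(2.0 * np.pi)
    return (best - mu) * cdf + sigma * pdf

def dim_error(x):
    # Hypothetical objective: dimensional error versus one normalized
    # printing parameter (e.g. layer thickness). Not real printer data.
    return (x - 0.63) ** 2 + 0.01 * np.sin(20.0 * x)

grid = np.linspace(0.0, 1.0, 201)          # candidate parameter settings
rng = np.random.default_rng(0)
x_obs = rng.uniform(0.0, 1.0, 4)           # initial "prints"
y_obs = dim_error(x_obs)

for _ in range(15):                         # sequential DOE driven by BO
    mu, sigma = gp_posterior(x_obs, y_obs, grid)
    acq = expected_improvement(mu, sigma, y_obs.min())
    x_next = grid[np.argmax(acq)]           # next experiment to run
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, dim_error(x_next))

best_x = x_obs[np.argmin(y_obs)]            # recommended parameter setting
```

Transferring to a second printer, as in the study, would amount to re-running only the BO loop on the new machine while seeding it with the first printer's observations, so far fewer new experiments are needed than starting from scratch.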
  • Item
    Development of a total atherosclerotic occlusion with cell-mediated calcium in animal arteries using tissue engineered scaffolds
    (2010) Zhu, Beili
    The most risky atherosclerosis includes the presence of calcium deposits in lumen of totally occluded arteries. Most animal models of total atherosclerotic occlusion do not mimic gradual occlusion process of the vessel and is often missing physiological calcium deposits in occlusion sites. The aim of this dissertation is to create chronic atherosclerotic occlusion that contains calcium in animal arteries using tissue engineered scaffolds. The overall project is composed of the following four aspects of work: (1) calcification ability of human aortic smooth muscle cells (HASMCs) on polylactic acid (PLA) films, (2) calcification of human primary osteoblast (HOB) cultures on polycaprolactone (PCL) scaffolds with TGF-â1 loading, (3) the effect of flow on the calcification of HOB/PCL construct, and (4) establishment of a total atherosclerotic occlusion in an animal artery by the implantation of HOB/PCL construct. (1) HASMCs were cultured on PLA films with three calcification induction techniques (BMP-2, Ca/Pi, and beta-GP treatments). Calcium staining, BMP-2 production of cells, and pH condition in media has been investigated. BMP-2 treatment did not show calcification in HASMC cultures. HASMC cultures are capable of depositing calcium through Ca/Pi stimulation, but not through beta-GP treatment. PLA films may hinder mineralization of cell cultures. Gas plasma treatment had no effect on cell culture calcification. (2) HOBs were first treated with TGF-beta1 and dexamethasone (Dex) in media on bare wells. Calcification was visualized under light microscope. Next, HOBs were cultured on PCL scaffolds with TGF-beta1 loading in the presence or absence of Dex in media. DNA content, ALP activity, amount and distribution of calcium were examined. On bare wells, highest calcification was observed in groups with both TGF-beta1 (0.02 ng/ml) and Dex (10 -10 M) in media. 
When cultured on scaffolds in Dex-supplemented media, TGF-beta1 appeared to have an inhibitory effect on scaffold calcification. When HOBs were cultured without Dex in media, the lower amount of TGF-beta1 loading (5 ng) showed the most calcification, high DNA synthesis, and high ALP activity on scaffolds. (3) HOBs were cultured on PCL scaffolds with TGF-beta1 loading under dynamic flow. DNA content, amount of calcium, visualization of calcium on scaffolds, and percentage weight loss of PCL scaffolds were measured. Dynamic flow improves the intensity and enlarges the distribution of calcium on PCL scaffolds. Calcification in HOB cultures can occur early under flow conditions. (4) HOB/PCL constructs were implanted into rabbit femoral arteries after 28 days of in vitro initiation of calcification treatment. Angiograms and gross histology of arteries were captured to examine the occlusion of the arteries. Fluorescent staining of calcium and EDS detection of elemental calcium were performed. Femoral arteries stayed totally occluded over 28 days. Physiologically deposited calcium was observed in chronic total occlusion (CTO) sites at 3, 10, and 28 days, with the day-10 specimens showing the maximum calcium. The animal work showed that a successful CTO model was developed using osteoblast-seeded PCL scaffolds with TGF-beta1 loading. This dissertation demonstrated the successful creation of a novel calcified CTO model in an animal artery using tissue-engineering strategies. This CTO model can be used to develop new devices and therapies to treat total atherosclerotic occlusion.
  • Item
    Classification of EEG recordings without perfect time-locking
    (2012) Zhu, Manli
    It has been established that neural responses are time-locked to stimuli; however, the latency in between may vary because of stimulus strength, subject fatigue, distraction, etc. Instead of assuming perfect time-locking between a stimulus and its neural response, we propose a statistical model that admits latency variation. We tested the approach on an EEG data set from an image Rapid Serial Visual Presentation (RSVP) experiment. Results show that the proposed approach consistently outperforms those relying on perfect time-locking. In addition, our approach can predict the stimulus onset time when this information is not available.
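The core idea of scoring a trial without assuming a fixed stimulus-to-response lag can be sketched as follows. This is a hypothetical simplification, not the dissertation's actual statistical model: instead of correlating a response template at lag zero only, the score is taken as the best match over a window of candidate latencies.

```python
import numpy as np

def max_shift_score(trial, template, max_lag):
    """Score a trial against a response template, allowing latency jitter.

    Rather than assuming perfect time-locking (lag 0), take the best
    correlation over a window of candidate latencies [0, max_lag].
    """
    best = -np.inf
    n = len(template)
    for lag in range(0, max_lag + 1):
        seg = trial[lag:lag + n]
        if len(seg) < n:
            break
        best = max(best, float(np.dot(seg, template)))
    return best

# Toy example: the template occurs at lag 3 inside the trial.
rng = np.random.default_rng(0)
template = rng.standard_normal(50)
trial = np.concatenate([np.zeros(3), template, np.zeros(10)])
fixed_lag = float(np.dot(trial[:50], template))          # perfect time-locking
jittered = max_shift_score(trial, template, max_lag=8)   # latency-tolerant
assert jittered >= fixed_lag
```

Because lag 0 is included in the search window, the latency-tolerant score can never be worse than the time-locked one; when the true latency is nonzero, it recovers the match the fixed-lag score misses.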
  • Item
    Intimate partner violence prevalence in a sample of veteran and civilian college students
    (2012) Zuniga, Steven Michael
    Intimate partner violence (IPV) has become an increasingly important issue, but little research has been conducted on military college samples. The majority of previous studies have focused on other factors related to IPV, including gender, ethnicity, and alcohol usage. This study sought to examine the prevalence of IPV in both military and civilian (non-military) samples within a college environment. Results showed a relationship between IPV and gender: females were much more likely to engage in IPV than males. Ethnicity was not found to be significant. Military status could not be examined due to a small sample size.
  • Item
    An Empirical Study on Security Vulnerabilities in Online Docker Images
    (2020) Zou, Xiaochen
    This paper presents an empirical study of the security vulnerabilities in Docker images that are publicly available in the Docker Hub repository. To perform the study, I developed an automated tool, ImageCheck, that collects the installed libraries in a Docker image and checks the library versions against the CVE database to detect potential vulnerabilities. The study uses the 1,487 most-downloaded free Docker images as subjects and considers all CVE vulnerabilities published from Jan. 2018 to Feb. 2020. ImageCheck detects 507 vulnerabilities in 250 Docker images, and the results show that these vulnerabilities span a large number of Docker image categories and vulnerability categories.
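The version-matching step such a tool performs can be sketched as below. This is a minimal illustration, not ImageCheck's actual implementation; the package list and CVE records are hypothetical examples, not real advisories.

```python
# Compare libraries found in an image against CVE records that list an
# affected version range [introduced, fixed). All data here is illustrative.

def parse_version(v):
    """Turn '1.2.10' into (1, 2, 10) so versions compare numerically."""
    return tuple(int(p) for p in v.split("."))

def find_vulnerabilities(installed, cve_db):
    """Return (package, version, cve_id) for every matching CVE record."""
    hits = []
    for pkg, ver in installed.items():
        for cve in cve_db.get(pkg, []):
            affected = (parse_version(cve["introduced"])
                        <= parse_version(ver)
                        < parse_version(cve["fixed"]))
            if affected:
                hits.append((pkg, ver, cve["id"]))
    return hits

# Hypothetical inventory of one image and a two-entry mini CVE database.
installed = {"openssl": "1.1.0", "zlib": "1.2.12"}
cve_db = {
    "openssl": [{"id": "CVE-EXAMPLE-0001", "introduced": "1.0.0", "fixed": "1.1.1"}],
    "zlib":    [{"id": "CVE-EXAMPLE-0002", "introduced": "1.2.0", "fixed": "1.2.12"}],
}
print(find_vulnerabilities(installed, cve_db))
# openssl 1.1.0 falls in [1.0.0, 1.1.1) and is flagged; zlib 1.2.12 is already fixed.
```

Note the numeric tuple comparison: naive string comparison would rank "1.10.0" below "1.9.9", a classic source of both false positives and false negatives in version matching.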
  • Item
    Two Essays on Value Creation in Platform Market
    (2021) Zhou, Qiang
    My two essays explore marketing strategies and problems associated with value creation in platform markets. In essay 1, I study online, pure-labor service platforms (e.g., Zeel, Amazon Home Services, Freelancer.com). A growing managerial concern on such platforms is the opportunistic behavior of service agents who defect with customers off the platform for future transactions, labeled platform exploitation. Using multiple studies (interviews, secondary data analysis, and experiments), the findings suggest that high-quality, long-tenured service agents may enhance platform usage, but customers are also more likely to defect with such agents. Platform exploitation also increases with greater customer–agent interaction frequency (i.e., stronger relationships). This phenomenon decreases agents' platform usage due to capacity constraints caused by serving more customers off the platform. These effects are stronger as service price increases (since higher prices equate to more fee savings), as service repetitiveness increases, and as the agent's on-platform customer pool comprises more repeat and more proximal customers. Finally, I provide managerial strategies to combat platform exploitation. In essay 2, I study bundling in product platform markets (e.g., videogame–console bundles). Previous research has largely focused on the effects of introducing such bundles, yet few studies examine the strategic considerations behind the bundle choice. In this research, I examine two important factors of system products, namely product lifecycle and network competitive position, that affect bundle introduction. Unlike previous research that assumes a single-firm decision, I utilize a two-sided matching approach that treats hardware and software makers as independent parties jointly making bundle decisions, so both sides' preferences are considered.
By analyzing a dataset of the US video game industry that contains console–game bundles between 1995 and 2015, I found that bundling a high-quality or a new game at the later stage of the console lifecycle can significantly increase the payoff for both sides, and that bundling a high-quality game when a console is a market underdog (with the least market share) can also enhance the payoff for both. In contrast, bundling a new game with an underdog console may only benefit the console maker while hurting the game developer.
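The two-sided matching idea, where both sides' preferences jointly determine the match, can be illustrated with the classic Gale-Shapley deferred-acceptance algorithm. This is a standard textbook stand-in, not the essay's empirical estimator, and all console/game names below are hypothetical.

```python
# Deferred acceptance: proposers propose in preference order; each reviewer
# tentatively holds its best offer so far. Both sides' rankings matter.

def deferred_acceptance(proposer_prefs, reviewer_prefs):
    """Compute a stable one-to-one matching (proposer -> reviewer)."""
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in reviewer_prefs.items()}
    free = list(proposer_prefs)          # proposers without a tentative match
    next_choice = {p: 0 for p in proposer_prefs}
    match = {}                           # reviewer -> proposer currently held
    while free:
        p = free.pop()
        r = proposer_prefs[p][next_choice[p]]
        next_choice[p] += 1
        if r not in match:
            match[r] = p                 # reviewer was free: hold the offer
        elif rank[r][p] < rank[r][match[r]]:
            free.append(match[r])        # reviewer trades up; old proposer freed
            match[r] = p
        else:
            free.append(p)               # offer rejected; p proposes again later
    return {p: r for r, p in match.items()}

# Hypothetical example: consoles propose to games.
consoles = {"consoleA": ["game1", "game2"], "consoleB": ["game1", "game2"]}
games = {"game1": ["consoleB", "consoleA"], "game2": ["consoleA", "consoleB"]}
print(deferred_acceptance(consoles, games))
# -> {'consoleB': 'game1', 'consoleA': 'game2'}
```

Although both consoles prefer game1, the stable outcome assigns game1 to consoleB because game1 ranks consoleB first; this is the sense in which a two-sided approach respects the preferences of both hardware and software makers.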
  • Item
    A Featherweight Deadlock Detection and Prevention System for Production Software
    (2017) Zhou, Jinpeng
    Deadlock is a critical problem that halts parallel programs with no further progress. Programmers may need to make tremendous efforts to achieve deadlock-freedom, because it requires a profound understanding of synchronization logic, especially when the program is highly concurrent with many threads or processes. Existing detection tools suffer from significant recording performance overhead and excessive memory overhead. Furthermore, they may introduce numerous false alarms, which require tremendous manual effort to confirm and fix. This thesis proposes a novel library-based runtime system, named UNDEAD, for defeating resource deadlocks related to mutex locks in production software. Different from existing detection tools, UNDEAD imposes negligible runtime performance overhead (2% on average) and 14% memory overhead, based on our evaluation on PARSEC benchmarks and seven real applications, including MySQL, Apache, SQLite, Memcached, Aget, Pbzip2, and Pfscan. Based on the detection results, UNDEAD automatically strengthens erroneous programs with the capability to prevent future occurrences of all detected deadlocks, similar to the existing work Dimmunix. However, UNDEAD exceeds Dimmunix with several orders of magnitude lower performance overhead, and eliminates numerous false positives. Its advantages, including extremely low overhead, bounded memory/storage overhead, and automatic prevention, make UNDEAD a convenient, always-on detection and "band-aid" prevention system for production software. In this thesis, we also provide a case study, then extract some basic definitions and properties from it for detecting communication deadlocks related to condition variables. Furthermore, we design a prototype that can be easily integrated into UNDEAD as an enhancement.
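The general idea behind resource-deadlock detectors of this kind can be sketched with the classic lock-order graph. This is a simplified illustration of the technique family, not UNDEAD's actual recording scheme: record an edge A -> B whenever a thread acquires lock B while already holding A, then flag a potential deadlock if the resulting directed graph contains a cycle.

```python
from collections import defaultdict

def has_cycle(edges):
    """DFS cycle check on a directed lock-order graph given as (held, acquired) pairs."""
    graph = defaultdict(set)
    for held, acquired in edges:
        graph[held].add(acquired)

    WHITE, GRAY, BLACK = 0, 1, 2     # unvisited / on current DFS path / done
    color = defaultdict(int)

    def dfs(node):
        color[node] = GRAY
        for nxt in graph[node]:
            if color[nxt] == GRAY:   # back edge: a lock-order cycle exists
                return True
            if color[nxt] == WHITE and dfs(nxt):
                return True
        color[node] = BLACK
        return False

    return any(color[n] == WHITE and dfs(n) for n in list(graph))

# Thread 1 takes L1 then L2; thread 2 takes L2 then L1: the classic ABBA deadlock.
assert has_cycle([("L1", "L2"), ("L2", "L1")])
# A consistent acquisition order (always L1 before L2) is deadlock-free.
assert not has_cycle([("L1", "L2"), ("L1", "L2")])
```

A cycle in this graph indicates only a *potential* deadlock, which is why tools built on this idea must work to suppress false positives, one of the weaknesses of prior work that the abstract highlights.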