Medicine, Biological and Life Sciences: Recent submissions
Now showing items 1-20 of 1369
-
Organisational contextual drivers of evidence-based practice across acute and primary care
Background: Evidence-based practice (EBP) is widely recognised as fundamental to high-quality nursing care, yet implementation remains uneven across healthcare settings in England. Attention has shifted from individual barriers to organisational context: leadership, team dynamics, access to resources, and social capital shape how nurses engage with EBP. Despite national policies promoting research-active environments, how these ambitions are realised at the frontline is unclear. This study examined how organisational factors influence nurses' implementation of evidence across acute and primary care. Methods: A cross-sectional design was used with registered nurses working in acute and primary care settings. Two validated instruments, the Evidence-Based Practice Implementation Scale and the Alberta Context Tool, were administered. A non-probability sampling strategy targeted the acute and general practice nursing workforce. Response distributions were monitored across pre-specified strata, and fieldwork closed once coverage and precision criteria were met. Descriptive statistics summarised participant and organisational characteristics. Inferential analyses compared settings, mediation modelling tested the role of social capital in the leadership-to-EBP pathway, and cluster analysis identified implementation profiles. Results: Engagement with EBP was moderate overall (M = 3.16, SD = 0.88), with no significant difference between sectors (p = 0.38). Acute care nurses reported higher leadership support (M = 4.01 vs. 3.78, p = 0.008) and better access to structural resources (M = 3.35 vs. 3.10, p = 0.004). Within acute care, leadership differed across specialties, with higher scores in ICU/CCU and general medicine, F(4, 636) = 4.12, p = 0.003. Social capital significantly mediated the association between leadership and EBP implementation (β = 0.15, 95% CI 0.10–0.21). Three engagement clusters were identified: high (32%), moderate (45%), and low (23%), each with a distinct organisational profile. Conclusion: Organisational context, particularly leadership and social capital, is central to nurses' capacity to implement evidence. Variation across specialties and sectors indicates that a one-size-fits-all approach is unlikely to succeed. Policy-relevant levers include formalising protected time, resourcing embedded facilitation, investing in knowledge infrastructure, and expanding clinical academic pathways to create environments where evidence use is routine and supported.
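The mediation result above (an indirect effect of leadership on EBP implementation via social capital, β = 0.15, 95% CI 0.10–0.21) is typically estimated as a product of path coefficients with a bootstrapped confidence interval. A minimal Python sketch under strong simplifying assumptions (unadjusted simple regressions and synthetic variable names; the study's actual model would also include the direct path and covariates):

```python
import random
import statistics

def slope(x, y):
    """OLS slope of y regressed on x (simple, unadjusted)."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

def indirect_effect(leadership, social_capital, ebp):
    # Product-of-coefficients: a-path (leadership -> social capital)
    # times b-path (social capital -> EBP implementation).
    a = slope(leadership, social_capital)
    b = slope(social_capital, ebp)
    return a * b

def bootstrap_ci(leadership, social_capital, ebp, n_boot=2000, seed=1):
    """Percentile 95% CI for the indirect effect via case resampling."""
    rng = random.Random(seed)
    n = len(leadership)
    effects = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        effects.append(indirect_effect(
            [leadership[i] for i in idx],
            [social_capital[i] for i in idx],
            [ebp[i] for i in idx]))
    effects.sort()
    return effects[int(0.025 * n_boot)], effects[int(0.975 * n_boot)]
```

If the percentile interval excludes zero, the indirect effect is taken as significant, mirroring the reported CI of 0.10–0.21.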
-
A situation analysis of diagnostic and management strategies for gestational Urinary Tract Infections (UTIs) in Kisumu County, Kenya: Maternal health implications and opportunities for diagnostic improvement
Urinary tract infections (UTIs) are linked to adverse pregnancy outcomes, yet epidemiological data on gestational UTIs in Kenya are limited. This study assessed diagnostic and management practices in Kisumu County to inform diagnostic and antimicrobial stewardship. A hospital-based retrospective study was conducted from February 2020 to February 2021 among 416 records of pregnant women at Chulaimbo and Nyahera Sub-County Hospitals. Socio-demographic, laboratory, and clinical history data were collected using structured forms and analysed in STATA 16.0. Statistical methods included chi-square, multivariate logistic regression, and Spearman's rank correlation (p ≤ 0.05). The dipstick-based presumptive proportion of UTIs was 57.9% (241/416), yet only 1.4% (6/416) had microbiological confirmation despite infections being recorded. The mean maternal age was 23.92 years, mean parity two, mean antenatal visits two, and mean haemoglobin 10.73 ± 1.8 g/dL. The first antenatal care attendance occurred at varying gestational ages in 56% (233/416). Of the antibiotics prescribed, 60% were from the WHO 'Access' group and 40% from the 'Watch' group. Gestational UTIs in Kisumu County were frequently managed without confirmatory diagnosis, increasing antimicrobial resistance risk. Strengthening management requires better laboratory capacity, sustained financial investment, improved antibiotic access, and adherence to WHO AWaRe guidelines to protect maternal and neonatal health.
-
Cardiovascular disease in the context of Metabolic Dysfunction-Associated Steatotic Liver Disease (MASLD): A comprehensive narrative review
Metabolic dysfunction-associated steatotic liver disease (MASLD) is a chronic hepatic disease with a rising global prevalence (25–38% of the general population). As a new term, MASLD was introduced in 2023 to replace the previous nomenclature of non-alcoholic fatty liver disease (NAFLD) and metabolic dysfunction-associated fatty liver disease (MAFLD). This new term/definition introduced changes in the diagnostic criteria and underscores the direct link between cardio-metabolic risk and this prevalent liver disease. In this context, the present review examines the clinical and pathophysiological links between MASLD and cardiovascular disease (CVD), providing a robust evidence synthesis of primarily systematic review data on the association between MASLD and coronary artery disease (CAD), atrial fibrillation (AF), and heart failure (HF). This association appears to be not only synergistic, but also independent of other known CVD risk factors, highlighting MASLD as a key cardio-metabolic risk factor that merits prompt diagnosis and treatment. The development of MASLD-related cardiovascular morbidity increases with the severity of the underlying hepatic pathology, particularly with progression to steatohepatitis and fibrosis. Notably, growing evidence highlights the links between MASLD and CVD through cardiac structural, electrical, and functional alterations that can progress to CAD, AF, and new-onset HF. Recognizing these links in clinical practice underscores the importance of early detection and multi-disciplinary management of MASLD to prevent disease progression and CVD complications.
-
Eating well when living with an intellectual disability—Exploring the carer: client relationship in residential settings
Background: Individuals with intellectual disabilities face increased risks of obesity and health issues. Carers in residential settings play a crucial role in shaping their dietary habits. This study explores how carers influence eating behaviours to identify strategies for healthy eating. Method: Seventeen semi-structured interviews were conducted with carers from three community homes. Thematic analysis identified three key themes: (i) whose responsibility is it?; (ii) food autonomy and choice in the context of caring relationships; (iii) opportunities for working together to support dietary choices. Results: Carers strive to encourage healthy eating while respecting residents' autonomy, particularly in those with cognitive impairments or on psychotropic medications. They use strategies like rapport-building, personalised care, and nutrition education. However, these efforts are limited by gaps in knowledge, time constraints, and variation in application and impact. Conclusions: Findings highlight the practice gap and the need for better training and resources to support carers in promoting healthy food choices while respecting residents' autonomy.
-
Organ-wide toxicological assessment of common edible herbs and their mixtures as used in home remedies
The use of home remedies for medicinal purposes, most of which are edible plants, remains common in many homes. However, reports of lethal effects following chronic use are increasing. Among the most commonly used medicinal plants are ginger, garlic, and lemon, which are prevalent across continents, with brewing and crude extraction being the most common means of consumption. This study investigated the organ-wide toxicity of these extracts following chronic consumption. Twenty-five albino Wistar rats, five per group, were used for this experiment. Each animal received 0.5 ml/kg body weight of either ginger extract, garlic extract, lemon juice, or a mixture of equal volumes of all three extracts (v/v) twice daily for seven (7) days. Statistics are presented as mean ± SE; P ≤ 0.05 was considered significant. Previous studies have shown that moderate consumption of these medicinal plants is beneficial with no deleterious effects. In this study, the weight of the animals continued to increase except in the groups that received lemon and the mixture, though these changes were not significant. Chronic consumption induced organ-wide toxicity involving the liver, kidney, intestinal epithelium, stomach, and pancreas, with altered tissue architecture and cell morphology. Packed cell volume was reduced in the lemon group and the group that received a combination of all extracts (p = 0.03). Blood differentials also showed changes: an elevated basophil level was observed with ginger and garlic (p < 0.0001; p = 0.0006), and monocyte levels increased progressively across groups compared with the control, with the highest level in the group that received the mixture (p < 0.0001). Lymphocyte count was reduced in all extract groups except the animals that received ginger. This study suggests caution among users of these medicinal plants and continues to draw attention to the need for harmonization and standardization of safe-use doses.
-
Awareness and Knowledge of Antimicrobial Resistance, Antimicrobial Stewardship and Barriers to Implementing Antimicrobial Susceptibility Testing among Medical Laboratory Scientists in Nigeria: A Cross-Sectional Study
Background: Antimicrobial resistance (AMR) is now considered one of the greatest global health threats, further compounded by a lack of new antibiotics in development. Antimicrobial stewardship programmes can improve and optimize the use of antibiotics, thereby increasing the cure rates of antibiotic treatment and decreasing the problem of AMR. In addition, diagnostic and antimicrobial stewardship in pathology laboratories are useful tools to guide clinicians on patient treatment and to stop the inappropriate use of antibiotics in empirical treatment or to narrow antibiotic therapy. Medical laboratory scientists are at the forefront of performing antibiotic susceptibility testing in pathology laboratories, helping clinicians to select the appropriate antibiotics for patients suffering from bacterial infections. Methods: This cross-sectional study surveyed personal antimicrobial usage, knowledge and awareness of AMR and antimicrobial stewardship, and barriers to antimicrobial susceptibility testing among medical laboratory scientists in Nigeria, using pre-tested and validated questionnaires administered online. The raw data were summarized in Microsoft Excel and further analyzed using IBM SPSS version 26. Results: Most respondents were male (72%) and aged 25–35 years (60%), and the BMLS degree was the highest educational qualification held by most respondents (70%). Of the 59.2% of respondents involved in antibiotic susceptibility testing, the disc diffusion method was the most commonly used (67.2%), followed by PCR/genome-based detection (5.2%); only a small percentage used the E-test (3.4%). The high cost of testing, inadequate laboratory infrastructure, and a lack of skilled personnel were the major barriers to performing antibiotic susceptibility testing. A good AMR knowledge level was more common among male respondents (75%) than female respondents (42.9%); knowledge level was associated with the respondent's gender (p = 0.048), and respondents with a master's degree were more likely to possess a good knowledge level of AMR (OR: 1.69; 95% CI: 0.33, 8.61). Conclusion: The findings of this study indicate that Nigerian medical laboratory scientists had moderate awareness of AMR and antibiotic stewardship. It is necessary to increase investments in laboratory infrastructure and manpower training, and to set up antimicrobial stewardship programmes, to ensure widespread antibiotic susceptibility testing in hospitals, thereby decreasing empirical treatment and the misuse of antibiotics.
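The odds ratio reported above (OR 1.69; 95% CI 0.33, 8.61, an interval that spans 1 and is therefore not statistically significant) follows the standard 2x2 computation with a log-based (Woolf) confidence interval. A sketch with hypothetical counts, not the survey's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with a Woolf (log-based) 95% CI.

    Illustrative layout (counts are hypothetical):
                             good AMR knowledge   not good
        master's degree              a                b
        other qualification          c                d
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi
```

A wide interval like 0.33 to 8.61 typically reflects small cell counts, which inflate the standard error of ln(OR).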
-
A generic theory of change-based framework with core indicators for monitoring the effectiveness of large-scale food fortification programs in low- and middle-income countries
Large-scale food fortification (LSFF) programs are widely implemented in low- and middle-income countries (LMIC) to alleviate micronutrient deficiencies. However, these programs may not achieve the desired impact due to poor design or bottlenecks in program implementation. Monitoring and evaluation (M&E) frameworks and a set of agreed indicators can help to benchmark progress and to strengthen the evidence base for effectiveness in a standardized way. We aimed to formulate recommendations towards core indicators for evaluating the effectiveness of LSFF programs with their associated metrics, methods, and tools (IMMT). For this, we used a multi-method iterative approach, including a mapping review of the literature, semi-structured interviews with international experts, compilation of a generic Theory of Change (ToC) framework for LSFF program delivery, and selection of IMMT for M&E of LSFF programs at key stages along the ToC delivery framework. Lastly, we conducted exploratory, qualitative interviews with key informants in Nigeria to explore experiences and perceptions related to the implementation of LSFF programs in Nigeria's context, and their opinions of the proposed set of core IMMT. The literature search resulted in 14 published and 15 grey literature documents, from which we extracted a total of 41 indicators. Based on the available literature and interviews with international experts, we mapped a ToC delivery framework and selected nine core indicators at the output, outcome, and impact levels for M&E of the effectiveness of LSFF programs. Key informants in Nigeria revealed that the main bottlenecks for implementation of the proposed IMMT relate to a lack of technical capacity, equipment, laboratory infrastructure, and financial resources. In conclusion, we propose a set of nine core indicators for enabling comprehensive M&E of the effectiveness of LSFF programs in LMIC. This proposed set of core indicators can be used for further evaluation, harmonization, and integration in national and international protocols for M&E of LSFF programs.
-
Effect of a fortified dairy-based drink on micronutrient status, growth, and cognitive development of Nigerian toddlers: a dose-response study
Malnutrition results in a high prevalence of stunting, underweight, and micronutrient deficiencies. This study investigated the effect of a multi-nutrient fortified dairy-based drink on micronutrient status, growth, and cognitive development in malnourished [height-for-age z-score (HAZ) and/or weight-for-age z-score (WAZ) < -1 SD and > -3 SD] Nigerian toddlers (n = 184, 1-3 years). The product was provided in different daily amounts (200, 400, or 600 ml) for 6 months. At baseline and endline, venous blood and urine samples were collected to determine micronutrient status. Bodyweight, height, waist, and head circumference were measured, and corresponding z-scores were calculated. The Bayley-III Screening Test was used to classify the cognitive development of the children. In a modified per-protocol (PP) population, the highest prevalences of micronutrient deficiencies were found for vitamin A (35.5%) and selenium (17.9%). At endline, there were no significant improvements in iodine, zinc, vitamin B12, or folate status in any of the three groups. Regarding vitamin D status (25OHD), consumption of 600 and 400 ml resulted in an improved status as compared to baseline, and in a difference between the 600- and 200-ml groups. Consumption of 600 ml also increased vitamin A and selenium status as compared to baseline, but no differences were found between groups. Within the groups, WAZ, weight-for-height z-score (WHZ), and BMI-for-age z-score (BAZ) improved, but without differences between the groups. For HAZ, only the 600 ml group showed improvement within the group, with no difference between groups. For absolute weight, height, and head circumference, only trends for differences between groups were indicated. Cognition results did not differ between the groups. Within groups, all showed a decline in the percentage of children classified as competent for receptive language. To study the effects of a nutritional intervention on linear growth and cognition, a longer study duration might be necessary. Regarding the improvement of micronutrient status, 600 ml of the fortified dairy-based drink seems most effective.
-
Multi-nutrient fortified dairy-based drink reduces anaemia without observed adverse effects on gut microbiota in anaemic malnourished Nigerian toddlers: A randomised dose–response study
The prevalence of anaemia among Nigerian toddlers is reported to be high; anaemia may cause significant morbidity, affect brain development and function, and result in weakness and fatigue. Although iron fortification can reduce anaemia, its effect on the gut microbiota is unclear. This open-label randomised study in anaemic malnourished Nigerian toddlers aimed to decrease anaemia without affecting pathogenic gut bacteria using a multi-nutrient fortified dairy-based drink. The test product was provided daily in different amounts (200, 400 or 600 mL, supplying 2.24, 4.48 and 6.72 mg of elemental iron, respectively) for 6 months. Haemoglobin, ferritin, and C-reactive protein concentrations were measured to determine anaemia, iron deficiency (ID), and iron deficiency anaemia (IDA) prevalence. Faecal samples were collected to analyse gut microbiota composition. All three dosages reduced anaemia prevalence, to 47%, 27% and 18%, respectively. ID and IDA prevalence was low and did not significantly decrease over time. Regarding gut microbiota, Enterobacteriaceae decreased over time without differences between groups, whereas Bifidobacteriaceae and pathogenic E. coli were not affected. In conclusion, the multi-nutrient fortified dairy-based drink reduced anaemia in a dose-dependent way, without stimulating potentially pathogenic intestinal bacteria, and thus appears to be safe and effective in treating anaemia in Nigerian toddlers.
-
Growth and micronutrient status parameters of Nigerian preterm infants consuming preterm formula or breastmilk
Background: Moderate-to-late preterm infants (32–34 weeks GA) have an increased risk of neonatal morbidities compared with term infants; however, dedicated nutritional guidelines are lacking. Methods: Moderate-to-late preterm infants received a preterm formula (n = 17) or breastmilk (n = 24) from age 2–10 weeks in a non-randomized, open-label observational study. Anthropometric measurements were assessed bi-weekly. Blood concentrations of hemoglobin, ferritin, serum retinol, and 25-hydroxy-vitamin D (25OHD) were analyzed at age 2 and 10 weeks. Results: Average growth was 14.7 g/kg BW/day in formula-fed and 12.8 g/kg BW/day in breastmilk-fed infants, not significantly different from each other. Length and head circumference in both groups were in line with the median reference values of the Fenton growth chart. At 10 weeks of age, hemoglobin tended to be higher in the formula-fed group (10.2 g/dL vs. 9.6 g/dL, p = 0.053). 25OHD increased in formula- and breastmilk-fed infants from 73.8 to 180.9 nmol/L and from 70.7 to 97.6 nmol/L, respectively. Serum retinol increased only in the formula-fed group (0.63 to 1.02 µmol/L, p < 0.001). Conclusion: Breastfeeding resulted in adequate growth in moderate-to-late preterm infants but was limiting in some micronutrients. The preterm formula provided adequate micronutrients, but weight gain velocity was higher than the Fenton reference value. Impact statement: Unfortified breastmilk resulted in adequate growth in weight, length, and head circumference in Nigerian moderate-to-late preterm infants during a study period of 8 weeks, but vitamin D, vitamin A, and iron status needs to be monitored. The high-energy formula, developed for very preterm infants, resulted in greater body-weight growth in moderate-to-late preterm infants than the median of the Fenton preterm growth chart. This study supports the necessity of dedicated nutritional guidelines and regular monitoring of growth and nutritional status of moderate-to-late preterm infants.
-
Metabolic, androgenic, and physical activity profiles in women aged over 40 years with polycystic ovary syndrome: A comparative analysis using UK Biobank data
Background: Polycystic ovary syndrome (PCOS) is the most common endocrine disorder in reproductive-aged women, linked to metabolic, hormonal, and psychological issues. Management typically involves lifestyle changes, including increased physical activity and reduced sedentary behaviour. Objectives: To compare the health profiles and behaviours of women with and without PCOS. Design: This study analysed data from the UK Biobank, which is a prospective cohort study. Methods: Women with PCOS in the UK Biobank were identified, while age- and body mass index (BMI)-matched controls were randomly selected. Data on factors associated with PCOS severity and self-reported lifestyle behaviours were analysed. Group differences were tested for significance, and participants were categorised by health behaviours to assess morbidity risk. Results: The study included 319 women with PCOS (mean age: 43.9 years) and 638 in each control group. Significant differences (p < 0.05) were observed in anthropometric (e.g. body weight, BMI, waist and hip circumference, and body fat), cardio-metabolic (e.g. blood pressure, triglycerides, and glycated haemoglobin), and androgenic (e.g. sex hormone-binding globulin) indices. Differences were most pronounced between PCOS and age-matched controls but remained when BMI was also considered. Women with PCOS engaged in less vigorous physical activity and had higher screen time and sedentary behaviours. Those with the lowest physical activity and highest sedentary time had the worst health profiles and highest morbidity risk, regardless of group. Conclusion: Women with PCOS exhibit poorer health despite only slight lifestyle differences. Across all participants, lower physical activity and higher sedentary behaviour were linked to increased health risks.
Further research is needed to clarify causal relationships between lifestyle factors and PCOS.
-
Loss of the RNA binding protein HuR in early murine limb mesenchyme does not affect development but leads to impaired bone homeostasis in adulthood
In this study, we examined how a critical posttranscriptional regulator, the RNA‐binding protein HuR (gene name Elavl1), contributes to the development and maintenance of limb skeletal tissue. Using the Prx1‐Cre knockout model, we examined the effect of germline knockout (Elavl1KO) and limb mesenchyme‐specific knockout (MSC‐Elavl1KO) of HuR on limb development. We found that Elavl1KO disrupted the development of the limb skeleton and was associated with a loss of signaling from the apical ectodermal ridge (AER). In contrast, MSC‐Elavl1KO did not appear to affect skeletal development. Mature MSC‐Elavl1KO mice appeared healthy, but their limb skeleton exhibited abnormal bone structure in both males and females at 2.5 months of age. Osteoblasts isolated from MSC‐Elavl1KO mice exhibited lower expression of osteoblastic marker genes, and their ability to generate a mineralized matrix was markedly impaired. RNA‐Seq analysis of these osteoblasts demonstrated that loss of HuR substantially influenced their transcriptome, affecting genes associated with a wide range of cellular processes. Finally, using siRNA knockdown in the human MG63 cell line, we identified that loss of HuR leads to increased mRNA turnover of the osteoblastic transcription factor Runx2. Overall, the study has demonstrated a critical role for HuR‐mediated posttranscriptional control in skeletal development and homeostasis, but finds that its expression in mesenchyme‐derived cells only becomes critical in mature skeletal tissue.
-
Erratum to: An interpretative phenomenological analysis (IPA) of coercion towards community dwelling older adults with dementia: findings from Mysore studies of natal effects on ageing and health (MYNAH)
A co-author's name was published incorrectly in the original publication of the article. The author name "Muffadal Bharmal" should be "Mufaddal Bharmal". The original article has been updated accordingly.
-
Mapping the age of autistic spectrum condition diagnosis, affected by sex and intellectual disability
Introduction: Autism is a complex neurodevelopmental condition thought to affect 1 in 100 children globally. While more commonly diagnosed in males, and during childhood, diagnoses are increasingly being made throughout adulthood. Purpose: To establish at what age autistic people receive their diagnosis, and whether the age of diagnosis was influenced by sex and by the presence of intellectual disability. Design: A quantitative, cross-sectional, retrospective study. Data were collected from the primary care records of six GP practices covering Ellesmere Port, a large town in Northwest England with 71,210 registered people. The mean age of diagnosis was calculated for the whole group and then for each subgroup, allowing comparison between males and females, and between those with and without a documented intellectual disability. Findings: Data from 1130 autistic participants were analysed. Participants were aged 3–81 years, with an age of autism diagnosis of 1–72 years. 85.6% of participants were diagnosed with autism by the age of 25 years, most commonly at 3 years of age (11.9%). Across the lifespan, the average age of diagnosis was 2.48 years later for females, and 5.05 years later for those with a learning disability. Practical implications: This study highlights the importance of healthcare professionals, educators, and caregivers recognising autistic traits in people across the lifespan, including the potential for diagnostic overshadowing. There are implications for commissioning autism services, to ensure adequate assessment pathway capacity for adolescents and adults as well as children.
-
Educational strategies for managing moral distress in student nurses: A scoping review
Aims: To explore what content and what teaching and learning activities are advocated by nurse educators to mitigate moral distress and related concepts in student nurses. Review methods: The review was conducted according to Joanna Briggs Institute guidelines, and the search strategy adopted their three-step method for systematic reviews. The eligibility criteria reflected the Population, Concept, Context format. Results: Following searches, 3809 records were screened against the eligibility criteria, resulting in 42 eligible papers: 29 research studies and 13 non-empirical papers. We identified 236 content suggestions, mapped to 70 subject codes, and 217 suggested teaching and learning activities, mapped to 41 coded activities. Data are charted in tables and figures, and results are discussed per related concept of moral distress. Conclusions: Educational content and teaching and learning activities are heterogeneous across the concepts influencing moral distress, with overlap of content across different concepts. Moral sensitivity attracted the most publications. Development of research and educational strategies addressing the other interrelated concepts would be advantageous for evidence-based curriculum development. Recommendations are made to develop evidence-based content and teaching and learning activities.
-
Association between sugar-sweetened beverage consumption and constructs of the Health Belief Model in young adult students at the University of Chester
Background: Young adults are reported to be among the major consumers of sugar-sweetened beverages (SSBs) globally, and SSB consumption is associated with long-term medical conditions. The Health Belief Model (HBM) has been applied through its constructs to understand SSB consumption by children and adolescents, but there is a gap in knowledge about applying the HBM to SSB consumption in young adults. The present study was undertaken to fill this gap. Objective: The primary objective of the study was to determine whether an association existed between SSB consumption status and the constructs of the HBM (perceived susceptibility, perceived severity, perceived barriers, perceived benefits, self-efficacy, and cues) in young adult students. Design: Cross-sectional data were obtained using an online self-administered structured questionnaire. Descriptive statistics and the chi-squared (χ2) test for association were used to analyse the data. Setting: Participants were recruited via email and WhatsApp. Participants: Seventy young adult students aged 18 to 30 years studying at the University of Chester, England. Results: The mean age of the participants was 25.5 years (SD: 3.0), and 53% consumed SSBs on a given day. Postgraduate (58%), Black (52%), and Asian (70%) students had the greater percentages of SSB consumers. There was a very strong significant association between SSB consumption status and level of perceived severity of diseases from a high intake of SSBs, χ2 (1, N = 70) = 6.94, P = 0.01, Cramér's V = 0.32. A very strong association also existed between SSB consumption status and self-efficacy level to control SSB intake, χ2 (1, N = 70) = 8.83, P = 0.00, Cramér's V = 0.36. Conclusions: A high percentage of young adult students, especially those from minority ethnic groups in the UK, consumed SSBs, indicating that initiatives targeted at these groups are required to control their intake. Interventions to control SSB intake in young adult students should consider increasing their level of perceived severity of diseases from a high intake of SSBs and their self-efficacy to control SSB intake. We recommend actions that can further increase awareness of how serious diseases from a high intake of SSBs are, and measures intended to increase the confidence of young adult students in their ability to avoid SSBs when they are stressed, after writing exams, when they are eating, and when they engage in sedentary activities. In particular, modifying their environment to prevent the availability of SSBs at home, and regulating the promotion and cost (via taxation) of SSBs in stores, would help to increase their self-efficacy level. Further studies are required to determine why a greater percentage of postgraduate students consume SSBs despite their level of education. More comprehensive research on SSB consumption in young adults using the HBM is necessary.
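Cramér's V values like the 0.32 and 0.36 above are derived directly from the chi-squared statistic: V = sqrt(χ2 / (n · min(r − 1, c − 1))), which for a 2x2 table reduces to sqrt(χ2 / n). A self-contained sketch with made-up counts, not the study's data:

```python
import math

def chi_square(table):
    """Pearson chi-squared statistic for an r x c contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (observed - expected) ** 2 / expected
    return chi2, n

def cramers_v(table):
    """Effect size: V = sqrt(chi2 / (n * min(r - 1, c - 1)))."""
    chi2, n = chi_square(table)
    k = min(len(table), len(table[0])) - 1
    return math.sqrt(chi2 / (n * k))
```

Conventional benchmarks for a 2x2 table treat V around 0.1 as small, 0.3 as medium, and 0.5 as large, which puts the reported 0.32 and 0.36 in the medium range.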
-
Analytical data of Acacia nilotica var. nilotica gum
This study aimed to characterize the exudate gum from Acacia nilotica var. nilotica in Sudan and compare its physicochemical properties to those of Acacia seyal var. seyal and Acacia senegal var. senegal (gum Arabic). Samples were collected from six different states in Sudan over three seasons. The gum had a moisture content of 10.50%, ash content of 1.86%, pH of 5.19, specific optical rotation of +94.70°, intrinsic viscosity of 10.44 cm³ g⁻¹, nitrogen content of 0.024%, protein content of 0.16%, acid equivalent weight of 1907.82, and total uronic acid content of 10.18%. Sugar content analysis revealed arabinose (41.20%), galactose (17.43%), and rhamnose (10.68%). Potassium was the predominant cation, followed by calcium, magnesium, sodium, lead, and iron. Acacia nilotica was classified as part of the Gummiferae series and exhibited a positive specific optical rotation. The number-average molecular weight (Mn) was estimated using osmometric measurements and gel permeation chromatography. The gum had a higher molecular weight and lower intrinsic viscosity compared with gum Arabic, suggesting a spheroidal molecular shape. Amino acid analysis showed similarities with gum Arabic, with hydroxyproline and serine as the principal amino acids. Variations in cationic composition were attributed to differences in soil type among collection locations.
-
Physical exercise impacts the performance of explosives detection dogs
Dogs (Canis familiaris) are widely used as scent detectors because of their sensitive olfactory capabilities, endurance, and ability to cover large areas quickly. Demand for them has risen with the global increase in terrorist threats involving specialized explosive devices. Detection dogs often face high temperatures and physical exertion, which can increase panting rate as a function of evaporative cooling and thereby inhibit olfactory ability. This study examined the impact of exercise on the search performance of 11 explosive detection dogs (eight Labradors and three Springer Spaniels). Each dog completed two trial sets: one after exercise with a ball thrower and one without exercise. Dogs were timed while searching for three types of explosives: trinitrotoluene, Composition-4, and ammonium nitrate. Data were analyzed in R using mixed-effects models, which showed that exercise significantly affected search duration and success for all explosive types. Searches averaged 29.58 seconds without exercise, while post-exercise searches took 44.91 seconds, and dogs were 1.14 times more likely to locate explosives without prior exercise. Dogs took the longest to find trinitrotoluene and were fastest with ammonium nitrate and Composition-4. These findings highlight the importance of allowing detection dogs adequate rest, as even brief exercise can impair their search performance.
-
Antibiotic use among university students in malaria therapy and its implications for antimicrobial resistance in Nigeria: a quantitative cross-sectional study
Background: Antimicrobial resistance (AMR) is a global health crisis, driven partly by inappropriate antibiotic use. In Nigeria, malaria remains highly prevalent and is often mismanaged with antibiotics, particularly in presumed malaria–typhoid co-infections. This study examined patterns of antibiotic use in malaria treatment among university students, highlighting implications for AMR. Methods: A cross-sectional survey was conducted among undergraduates purposively selected from 12 universities across Nigeria’s six geopolitical zones. Data were collected via validated online questionnaires (February–March 2025) and analysed using descriptive statistics, chi-square tests, logistic regression, and Spearman correlation (SPSS v26). Results: Of 646 respondents, >97% demonstrated general antibiotic knowledge, yet 27.6% misidentified chloroquine as an antibiotic. While 94.6% correctly recognised antibiotics as treatments for bacterial infections, about one-fifth believed they were effective against fungal, parasitic, or viral diseases. Despite 84.7% AMR awareness, 49.1% reported using antibiotics for malaria treatment. Misuse was highest in the Northeast (62.3%), Northwest (63.7%), and South-South (32.9%). In the Northeast, key drivers included prior experience (35.4%), pharmacist advice (29.9%), and peer influence (28.0%), while only 6.7% followed physician prescriptions. Misuse correlated with the belief that antibiotics treat all illnesses (rs = 0.329, p < 0.001). Nearly half (49.5%) accessed antibiotics without prescriptions. Conclusions: High AMR awareness contrasts with persistent misuse of antibiotics for malaria, reflecting misconceptions, regional disparities, and weak regulation. Targeted education, stricter antibiotic controls, and improved diagnostics are urgently needed to curb AMR in Nigeria.