Notes
The horizontal inequality index for adult obesity decreased from 0.1377 in 1997 to 0.0164 in 2011; for central obesity, it decreased from 0.0806 in 1997 to -0.0193 in 2011. The main contributors to inequality in both outcomes include economic status, marital status, and educational attainment.
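Horizontal-inequality indices of this kind are typically built from an income-ranked concentration index. A minimal sketch of that building block is below; the data and values are made up for illustration, and the paper's actual index additionally standardizes for need factors.

```python
import numpy as np

def concentration_index(health, income):
    """Concentration index C = 2*cov(h, r) / mean(h), where r is the
    fractional rank of each observation when sorted by income (0..1).
    C > 0 means the health outcome is concentrated among the richer."""
    health = np.asarray(health, dtype=float)
    order = np.argsort(income)          # rank observations by income
    h = health[order]
    n = len(h)
    r = (np.arange(1, n + 1) - 0.5) / n  # fractional income rank
    return 2.0 * np.cov(h, r, bias=True)[0, 1] / h.mean()

# Hypothetical pro-rich pattern: obesity rises with income -> C > 0
income = [10, 20, 30, 40, 50]
obesity = [0.10, 0.15, 0.20, 0.25, 0.30]
print(round(concentration_index(obesity, income), 3))  # prints 0.2
```

A value drifting toward zero (or below) over time, as reported above, indicates the pro-rich concentration weakening or reversing toward the poor.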
From 1997 to 2011, the prevalence of adult obesity and central obesity increased annually. The pro-rich inequalities in both adult and central obesity decreased from 1997 to 2011. In 2011, the inequality in central obesity was more prominent in the low-income group. Future policies may need to address obesity reduction among the poor.

When the eyes rotate during translational self-motion, the focus of expansion (FOE) in optic flow no longer indicates heading, yet heading judgements are largely unbiased. Much emphasis has been placed on the role of extraretinal signals in compensating for the visual consequences of eye rotation. However, recent studies also support a purely visual mechanism of rotation compensation in heading-selective neurons. Computational theories support a visual compensatory strategy but require different visual depth cues. We examined the rotation tolerance of heading tuning in macaque area MSTd using two different virtual environments: a frontoparallel (2D) wall and a 3D cloud of random dots. Both environments contained rotational optic flow cues (i.e., dynamic perspective), but only the 3D cloud stimulus contained local motion parallax cues, which are required by some models. The 3D cloud environment enhanced neither the rotation tolerance of heading tuning for individual MSTd neurons nor the accuracy of heading estimates decoded from population activity, suggesting a key role for dynamic perspective cues. We also added vestibular translation signals to optic flow to test whether rotation tolerance is enhanced by non-visual cues to heading. We found no benefit of vestibular signals overall, but a modest effect for some neurons with significant vestibular heading tuning. We also found that neurons with more rotation-tolerant heading tuning are typically less selective to pure visual rotation cues.
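Decoding a heading estimate from population activity, as mentioned above, can be sketched with a simple population-vector readout. The tuning model here (an idealized von Mises population) and all parameters are hypothetical, not the study's recorded data or decoder.

```python
import numpy as np

# Evenly spaced preferred headings for a hypothetical population of units
prefs = np.linspace(-np.pi, np.pi, 64, endpoint=False)  # radians

def rates(heading, kappa=2.0):
    """Hypothetical von Mises tuning curves: peak rate at each unit's
    preferred heading, falling off with angular distance."""
    return np.exp(kappa * (np.cos(heading - prefs) - 1.0))

def decode(r):
    """Population vector: angle of the rate-weighted sum of each unit's
    preferred-direction unit vector."""
    return np.arctan2((r * np.sin(prefs)).sum(), (r * np.cos(prefs)).sum())

estimated = decode(rates(0.3))  # recovers ~0.3 rad for this idealized population
```

Rotation tolerance, in these terms, asks whether such a readout stays accurate when a rotational flow component is added to the stimulus.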
Together, our findings help to clarify the types of information used to construct heading representations that are tolerant to eye rotations.

Due to challenges in performing routine personalized dosimetry in radiopharmaceutical therapies, interest in single-time-point (STP) dosimetry, particularly using only one SPECT scan, is on the rise. Meanwhile, there are questions about the reliability of STP dosimetry, with limited independent validation. In the present work, we analyze two STP dosimetry methods and evaluate dose errors for a number of radiopharmaceuticals based on effective half-life distributions.

Methods: We first challenged the common assumption that radiopharmaceutical effective half-lives across the population are Gaussian (normally) distributed. Dose accuracy was then estimated for two STP dosimetry methods over a wide range of potential scan times post-injection (p.i.), for different radiopharmaceuticals applied to neuroendocrine tumors and prostate cancer. The accuracy and limitations of each STP method are discussed.

Results: The log-normal distribution was shown to be more appropriate for capturing effective half-life distributions. The STP framework was shown to be promising for dosimetry of 177Lu-DOTATATE and for kidney dosimetry of different radiopharmaceuticals (errors less than 30%). Meanwhile, for some radiopharmaceuticals, STP accuracy is compromised (e.g. in bone marrow and tumors for 177Lu-PSMA therapies). The optimal SPECT scanning time for 177Lu-DOTATATE is at ~72 h p.i., while 48 h p.i. would be better for 177Lu-PSMA compounds.

Conclusion: Our results demonstrate that simplified STP dosimetry methods may compromise the accuracy of dose estimates, with some exceptions such as 177Lu-DOTATATE and kidney dosimetry for different radiopharmaceuticals. Simplified personalized dosimetry in the clinic continues to be a challenging task.
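The core STP error mechanism can be sketched under a mono-exponential washout assumption: a population-based STP method substitutes an assumed population effective half-life for the patient's true one, and the resulting error in time-integrated activity depends on the scan time. The half-life values below are illustrative, not the paper's fitted distributions.

```python
import numpy as np

LN2 = np.log(2.0)

def tia_from_stp(a_scan, t_scan, t_eff):
    """Time-integrated activity from one measurement, assuming
    mono-exponential washout A(t) = A0 * exp(-lambda_eff * t):
    TIA = A(t_scan) * exp(lambda_eff * t_scan) / lambda_eff."""
    lam = LN2 / t_eff
    return a_scan * np.exp(lam * t_scan) / lam

def stp_error(t_scan, t_eff_true, t_eff_pop):
    """Percent TIA error when the population half-life is assumed
    instead of the patient's true effective half-life (A0 = 1)."""
    lam_true = LN2 / t_eff_true
    a_scan = np.exp(-lam_true * t_scan)   # simulated measurement
    tia_true = 1.0 / lam_true
    tia_est = tia_from_stp(a_scan, t_scan, t_eff_pop)
    return 100.0 * (tia_est - tia_true) / tia_true

# Patient half-life 30% longer than an assumed 90 h population value:
err_at_72h = stp_error(t_scan=72.0, t_eff_true=117.0, t_eff_pop=90.0)
```

Sweeping `t_scan` in such a sketch shows why errors stay moderate when the scan time is near the effective half-life, which is the intuition behind preferring ~72 h p.i. for long-half-life compounds and earlier scans for faster-clearing ones.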
Based on these results, we make suggestions and recommendations for improved personalized dosimetry using simplified imaging schemes.

Accurate delineation of the intraprostatic gross tumour volume (GTV) is a prerequisite for treatment approaches in patients with primary prostate cancer (PCa). Prostate-specific membrane antigen positron emission tomography (PSMA-PET) may outperform MRI in GTV detection. However, visual GTV delineation is subject to interobserver heterogeneity and is time-consuming. The aim of this study was to develop a convolutional neural network (CNN) for automated segmentation of intraprostatic tumour (GTV-CNN) in PSMA-PET.

Methods: The CNN (3D U-Net) was trained on 68Ga-PSMA-PET images of 152 patients from two different institutions; the training labels were generated manually using a validated technique. The CNN was tested on two independent internal test datasets (cohort 1: 68Ga-PSMA-PET, n = 18; cohort 2: 18F-PSMA-PET, n = 19) and one external test dataset (cohort 3: 68Ga-PSMA-PET, n = 20). Agreement between manual contours and GTV-CNN was assessed with the Dice-Sørensen coefficient (DSC). Sensitivity and specificity were calculated for the two internal test datasets (cohort 1: n = 18, cohort 2: n = 11) using whole-mount histology.

Results: Median DSCs for cohorts 1-3 were 0.84 (range 0.32-0.95), 0.81 (range 0.28-0.93) and 0.83 (range 0.32-0.93), respectively. Sensitivities and specificities for GTV-CNN were comparable with manual expert contours: 0.98 and 0.76 (cohort 1) and 1.00 and 0.57 (cohort 2), respectively. Computation time was around 6 seconds for a standard dataset.

Conclusion: The application of a CNN for automated contouring of intraprostatic GTV in 68Ga-PSMA- and 18F-PSMA-PET images resulted in high concordance with expert contours and high sensitivities and specificities in comparison with the histology reference. This robust, accurate and fast technique may be implemented for treatment concepts in primary PCa.
The trained model and the study's source code are available in an open source repository.
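The Dice-Sørensen coefficient used above to compare contours is straightforward to compute on binary segmentation masks. A minimal sketch (with made-up toy masks, not the study's data):

```python
import numpy as np

def dice(a, b):
    """Dice-Sørensen coefficient between two binary masks:
    DSC = 2*|A ∩ B| / (|A| + |B|); 1.0 = perfect overlap."""
    a = np.asarray(a, dtype=bool)
    b = np.asarray(b, dtype=bool)
    inter = np.logical_and(a, b).sum()
    denom = a.sum() + b.sum()
    return 2.0 * inter / denom if denom else 1.0

# Toy 2D example: a 4-voxel prediction against a 6-voxel reference
pred = np.zeros((4, 4), dtype=bool); pred[1:3, 1:3] = True
ref  = np.zeros((4, 4), dtype=bool); ref[1:3, 1:4] = True
# overlap = 4 voxels -> DSC = 2*4 / (4 + 6) = 0.8
```

In practice the same formula is applied voxel-wise to the 3D PET masks; median DSC across patients then summarizes agreement per cohort.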