Posts Tagged Journal Reference

Discovery of new white blood cell reveals target for better vaccine design


ScienceDaily (July 27, 2012) — Researchers in Newcastle and Singapore have identified a new type of white blood cell that activates a killing immune response against an external source of antigen — providing a potential new target for vaccines against conditions such as cancer or Hepatitis B.

Publishing in the journal Immunity, the team of researchers from Newcastle University in collaboration with A*STAR’s Singapore Immunology Network (SIgN) describe a new human tissue dendritic cell with cross-presenting function.

Dendritic cells (DCs) are a type of white blood cell that orchestrate our body’s immune responses to infectious agents such as bacteria and viruses, as well as cancer cells. They are also very important for eliciting the immune response generated by vaccines.

DCs kick-start an immune response by presenting on their surface small molecular fragments, called antigens, from micro-organisms such as bacteria and viruses, or from vaccines or tumours. This leads to activation of another white blood cell subset called T cells, which specialise in killing cells and are crucial for eliminating cancerous or infected cells. Most cells can present only antigens from within themselves, and so will elicit an immune response only if they are themselves infected. Only a specialised subset of DCs is able to generate a response to an external source of antigen, for example bacteria, vaccines and tumours.

The identity of human tissue DCs that are capable of presenting external antigen to activate the cell-killing response by T cells — a process termed ‘cross-presentation’ — has remained a mystery. Their discovery, as revealed by this research, will help scientists to design better targeted vaccine strategies to treat cancer and infections such as Hepatitis B.

“These are the cells we need to be targeting for anti-cancer vaccines,” said Dr Muzlifah Haniffa, a Wellcome Trust Intermediate Fellow and Senior Clinical Lecturer at Newcastle University. “Our discovery offers an accessible, easily targetable system which makes the most of the natural ability of the cell.” The researchers also showed for the first time that dendritic cell subsets are conserved between species and have in effect created a map, facilitating the translation of mouse studies to the human immune system.

“The cross-species map is in effect a Rosetta stone that deciphers the language of mouse into human,” explains Matthew Collin, Professor of Haematology from Newcastle University.

In the paper the researchers describe how the cross-presenting DCs were first isolated from surplus skin left over from plastic surgery, which was enzymatically digested to break down the gelatinous collagen and release the cells. This research will have significant impact on the design of vaccines and other targeted immunotherapies.

The Rosetta Stone of our immune system: Mapping Human and Mouse dendritic cells

The Newcastle University team, in collaboration with A*STAR’s Singapore Immunology Network (SIgN), has for the first time aligned the dendritic cell subsets between mouse and human, allowing mouse studies to be translated accurately to the human immune system.

The researchers isolated the dendritic cells from human blood and skin and those from mouse blood, lung and liver. Using gene expression analysis, they identified gene signatures for each human dendritic cell subset. Mouse orthologues of these genes were identified and a computational analysis was performed to match subsets across species.
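The cross-species matching step described above can be sketched in code. The following is a hypothetical illustration of the general idea, not the authors' actual pipeline: given expression values for orthologous genes, each human DC subset is matched to the mouse subset whose expression profile correlates with it most strongly. The subset names and expression values are invented for the sketch.

```python
import numpy as np
import pandas as pd

# Hypothetical expression matrices: rows are genes already mapped to
# one-to-one mouse/human orthologue pairs, columns are DC subsets.
rng = np.random.default_rng(0)
genes = [f"gene{i}" for i in range(6)]
human = pd.DataFrame(rng.random((6, 3)), index=genes,
                     columns=["hCD141", "hCD1c", "hpDC"])
mouse = pd.DataFrame(rng.random((6, 3)), index=genes,
                     columns=["mCD103", "mCD11b", "mpDC"])

def best_match(h_col):
    # Correlate one human subset's profile against every mouse subset
    # and return the name of the best-correlated mouse subset.
    corrs = mouse.apply(lambda m_col: h_col.corr(m_col))
    return corrs.idxmax()

matches = {h: best_match(human[h]) for h in human.columns}
print(matches)
```

In practice such an analysis would use measured gene-signature scores rather than raw random values, but the matching logic, correlate orthologue profiles and take the best hit per subset, is the same shape.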

This provides scientists for the first time with an accurate model to compare DCs between species. Professor Matthew Collin explains: “This is in effect a Rosetta stone that deciphers the language of mouse into human. It can put into context the findings from the extensive literature using mouse models to the human settings.”

Dr. Haniffa added: “These gene signatures are available in a public repository accessible for all researchers to benefit from the data. It will allow detailed knowledge of individual human dendritic cell subsets to enable specific targeting of these cells for therapeutic strategy.”



Story Source:

The above story is reprinted from materials provided by Newcastle University.

Note: Materials may be edited for content and length. For further information, please contact the source cited above.


Journal Reference:

  1. Muzlifah Haniffa, Amanda Shin, Venetia Bigley, Naomi McGovern, Pearline Teo, Peter See, Pavandip Singh Wasan, Xiao-Nong Wang, Frano Malinarich, Benoit Malleret, Anis Larbi, Pearlie Tan, Helen Zhao, Michael Poidinger, Sarah Pagan, Sharon Cookson, Rachel Dickinson, Ian Dimmick, Ruth F. Jarrett, Laurent Renia, John Tam, Colin Song, John Connolly, Jerry K.Y. Chan, Adam Gehring, Antonio Bertoletti, Matthew Collin, Florent Ginhoux. Human Tissues Contain CD141hi Cross-Presenting Dendritic Cells with Functional Homology to Mouse CD103 Nonlymphoid Dendritic Cells. Immunity, 2012; 37 (1): 60 DOI: 10.1016/j.immuni.2012.04.012

Note: If no author is given, the source is cited instead.

Disclaimer: This article is not intended to provide medical advice, diagnosis or treatment. Views expressed here do not necessarily reflect those of ScienceDaily or its staff.


Fluoxetine — a.k.a., Prozac — is effective as an anti-viral, study suggests


ScienceDaily (July 27, 2012) — UCLA researchers have come across an unexpected potential use for fluoxetine — commonly known as Prozac — which shows promise as an antiviral agent. The discovery could provide another tool in treating human enteroviruses that sicken and kill people in the U.S. and around the world.

Human enteroviruses are members of a genus containing more than 100 distinct RNA viruses responsible for various life-threatening infections, such as poliomyelitis and encephalitis. While immunization has all but eliminated the poliovirus, the archetype for the genus, no antiviral drugs currently exist for the treatment of enterovirus infections, which are often severe and potentially fatal. In view of the favorable pharmacokinetics and safety profile of fluoxetine — which belongs to a class of compounds typically used to treat depression, anxiety disorders and some personality disorders — the research team concluded that it warrants additional study as a potential antiviral agent for enterovirus infections.

Using molecular screening, the UCLA research team from the Department of Pediatrics, the California NanoSystems Institute and the Department of Molecular and Medical Pharmacology found that fluoxetine is a potent inhibitor of coxsackievirus replication. Coxsackievirus, an enterovirus related to poliovirus and echovirus, is found in the gastrointestinal tract, and exposure to the virus can cause a range of infections and diseases.

“The discovery of unexpected antiviral activity of fluoxetine is scientifically very significant and draws our attention to previously overlooked potential targets of fluoxetine and other psychogenic drugs,” said Robert Damoiseaux, scientific director of the Molecular Screening Shared Resource at the California NanoSystems Institute. “Part of our follow-up work will be the discovery of these unconventional targets for fluoxetine and other drugs of the same class and how these targets intersect with the known targets of this drug class.”

Paul Krogstad, professor of pediatrics and molecular and medical pharmacology, added that understanding the mechanisms of action of fluoxetine and norfluoxetine against coxsackieviruses “will add to our understanding of enterovirus replication and lead to assessment of their potential clinical utility for the future treatment of serious enterovirus infections.”

The research team found that fluoxetine did not interfere with either viral entry or translation of the viral genome. Instead, fluoxetine and norfluoxetine markedly reduced the production of viral RNA and protein.

The study was published on July 2 in the journal Antimicrobial Agents and Chemotherapy. Study authors also include Jun Zuo, Kevin K. Quinn, Steve Kye, and Paige Cooper from the Department of Pediatrics. The study was supported by grants from the Today’s and Tomorrow’s Children’s Fund and the UCLA Department of Pediatrics Nanopediatrics Program.



Story Source:

The above story is reprinted from materials provided by University of California – Los Angeles. The original article was written by Jennifer Marcus.



Journal Reference:

  1. J. Zuo, K. K. Quinn, S. Kye, P. Cooper, R. Damoiseaux, P. Krogstad. Fluoxetine is a Potent Inhibitor of Coxsackievirus Replication. Antimicrobial Agents and Chemotherapy, 2012; DOI: 10.1128/AAC.00983-12


Tumor cells’ inner workings predict cancer progression


ScienceDaily (July 27, 2012) — Using a new assay method to study tumor cells, researchers at the University of California, San Diego School of Medicine and UC San Diego Moores Cancer Center have found evidence of clonal evolution in chronic lymphocytic leukemia (CLL). The assay method distinguishes features of leukemia cells that indicate whether the disease will be aggressive or slow-moving, a key factor in when and how patients are treated.

The findings are published in the July 26, 2012 First Edition online issue of Blood.

The progression of CLL is highly variable, dependent upon the rate and effects of accumulating monoclonal B cells in the blood, marrow, and lymphoid tissues. Some patients are symptom-free for years and do not require treatment, which involves the use of drugs that can cause significant side effects and are not curative. In other patients, however, CLL is relatively aggressive and demands therapeutic intervention soon after diagnosis.

“Our study shows that there may not be a sharp dividing line between the more aggressive and less aggressive forms of CLL,” said Thomas J. Kipps, MD, PhD, Evelyn and Edwin Tasch Chair in Cancer Research and senior author of the study. “Instead, it seems that over time the leukemia cells of patients with indolent disease begin to use genes similar to those that are generally used by CLL cells of patients with aggressive disease. In other words, prior to requiring therapy, the patterns of genes expressed by CLL cells appear to converge, regardless of whether or not the patient had aggressive versus indolent disease at diagnosis.”

Existing markers for aggressive or indolent disease are mostly fixed, and their predictive value declines the longer a patient is from his or her initial diagnosis. At the time a blood sample is collected, these markers cannot reliably predict whether a CLL patient will soon need therapy, particularly when the patient has carried the diagnosis of CLL for many years.

Kipps and colleagues studied thousands of genes, particularly those that code for proteins, in a group of 130 CLL patients with varying risks of disease progression. They identified 38 prognostic subnetworks of interacting genes and proteins that, at the time of sample collection, indicate the relative aggressiveness of the disease and predict when the patient will require therapy. They confirmed their work by applying the method to two other, smaller CLL patient cohorts in Germany and Italy.

The subnetworks offer greater predictive value because they are based not on expression levels of individual genes or proteins, but on how they dynamically interact and change over time, influencing the course of the CLL and patient symptoms.
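The core idea of scoring gene sets rather than single genes can be sketched as follows. This is a toy illustration of subnetwork-style aggregation, not the algorithm from the Blood paper: each subnetwork's activity in a patient is summarized as the mean z-scored expression of its member genes. The gene names and expression values are made up for the sketch.

```python
import numpy as np

# Hypothetical expression data: one array of values (20 patients) per gene.
rng = np.random.default_rng(1)
expression = {g: rng.normal(size=20)
              for g in ["geneA", "geneB", "geneC", "geneD"]}

def zscore(x):
    # Standardize one gene's expression across patients.
    return (x - x.mean()) / x.std()

def subnetwork_score(expr, members):
    # Aggregate a set of interacting genes into a single activity
    # score per patient, instead of ranking genes one at a time.
    return np.mean([zscore(expr[g]) for g in members], axis=0)

score = subnetwork_score(expression, ["geneA", "geneB"])
print(score.shape)  # one activity score per patient
```

Averaging over a coherent gene set damps single-gene measurement noise, which is one intuition behind the "families rather than individuals" framing quoted below; the actual study additionally used protein-interaction networks to define the sets.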

“In a sense, we looked at families rather than individuals,” said Kipps. “If you find an interconnected family in which most genes or proteins are expressed at higher levels, it becomes more likely that those genes and proteins have functional significance.”

He added that while the subnetworks abound in data, their complexity actually makes them easy to interpret and understand. “It’s like when you look out of a window and see the sky, clouds, trees, people, cars. You’re getting tremendous amounts of information that individually doesn’t tell you much. But when you look at the scene as a whole, you see patterns and networks. This work is similar. We’re taking all of the individual gene expression patterns and making sense of them as a whole. We’re able to see more clearly how they control and regulate function.”

The findings help define how CLL — and perhaps other cancers — evolve over time, becoming more aggressive and deadly. “It’s as if each tumor has a clock which determines how frequently it may acquire the chance changes that make it behave more aggressively. Although the rates can vary, it appears that tumors march down similar pathways, which converge over time to a point where they become aggressive enough to require therapy.”

The study may alter how scientists think about CLL and how clinicians treat the disease: whether it is better to wait for later stages of the disease when tumor cells are more fragile and easier to kill, or treat early-stage indolent tumor cells aggressively, when they are fewer in number but harder to find and more resistant to therapy.



Story Source:

The above story is reprinted from materials provided by University of California – San Diego, via EurekAlert!, a service of AAAS.



Journal Reference:

  1. Han-Yu Chuang, Laura Rassenti, Michelle Salcedo, Kate Licon, Alexander Kohlmann, Torsten Haferlach, Robin Foà, Trey Ideker, and Thomas J. Kipps. Subnetwork-based analysis of chronic lymphocytic leukemia identifies pathways that associate with disease progression. Blood, July 26, 2012 DOI: 10.1182/blood-2012-03-416461


Novel therapy may prevent damage to the retina in diabetic eye diseases


ScienceDaily (July 27, 2012) — Researchers at the University of Michigan Kellogg Eye Center have identified a compound that could interrupt the chain of events that cause damage to the retina in diabetic retinopathy. The finding is significant because it could lead to a novel therapy that targets two mechanisms at the root of the disease: inflammation and the weakening of the blood barrier that protects the retina.

To date, treatments for diabetic retinopathy, the leading cause of blindness among working-age Americans, have been aimed largely at one of those mechanisms.

In diabetic retinopathy, damage to the retina results, in part, from the activity of vascular endothelial growth factor (VEGF), a protein that weakens the protective blood-retinal barrier. Recent drugs targeting VEGF have produced a good response in nearly half of patients with diabetic retinopathy. But researchers believe that there is also an inflammatory component that may contribute to the disease process.

The study, published in the Biochemical Journal in June 2012 [epub ahead of print], identifies a specific protein common to both pathways as an important target in regulating the disease process in which blood vessels become leaky, and describes a compound that may be developed into a therapeutic intervention for patients in whom anti-VEGF treatment alone is not sufficient.

“In diabetic retinopathy and a host of other retinal diseases, increases in VEGF and inflammatory factors — some of the same factors that contribute to the response to an infection — cause blood vessels in the eye to leak which, in turn, results in a buildup of fluid in the neural tissue of the retina,” says David A. Antonetti, Ph.D., Professor, Department of Ophthalmology and Visual Sciences and Molecular and Integrative Physiology, who has also been awarded a Jules and Doris Stein Professorship from Research to Prevent Blindness. “This insidious form of modified inflammation can eventually lead to blindness.”

The compound targets atypical protein kinase C (aPKC), which VEGF requires to make blood vessels leak. Moreover, Antonetti’s laboratory has demonstrated that the compound is effective at blocking damage from tumor necrosis factor, an inflammatory protein that is also elevated in diabetic retinopathy. Benefits of this compound could extend to therapies for uveitis, or for changes to brain blood vessels in the presence of brain tumors or stroke.

“This is a great leap forward,” says Antonetti. “We’ve identified an important target in regulating blood vessel leakage in the eye and we have a therapy that works in animal models. Our research is in the early stages of development. We still have a long way to go to demonstrate effectiveness of this compound in humans to create a new therapy but the results are very promising.”



Story Source:

The above story is reprinted from materials provided by University of Michigan Health System, via Newswise.



Journal Reference:

  1. Paul Titchenell, Cheng-Mao Lin, Jason Keil, Jeffrey Sundstrom, Charles Smith, David Antonetti. Novel Atypical PKC Inhibitors Prevent Vascular Endothelial Growth Factor-Induced Blood-Retinal Barrier Dysfunction. Biochemical Journal, 2012; DOI: 10.1042/BJ20111961


Researchers find link between childhood abuse and age at menarche


ScienceDaily (July 27, 2012) — Researchers from Boston University School of Medicine (BUSM) have found an association between childhood physical and sexual abuse and age at menarche. The findings are published online in the Journal of Adolescent Health.

Researchers led by corresponding author Renée Boynton-Jarrett, MD, assistant professor of pediatrics at BUSM, found a 49 percent increase in risk for early onset menarche (menstrual periods prior to age 11 years) among women who reported childhood sexual abuse compared to those who were not abused. In addition, there was a 50 percent increase in risk for late onset menarche (menstrual periods after age 15 years) among women who reported severe physical abuse in childhood. The participants in the study included 68,505 women enrolled in the Nurses’ Health Study II, a prospective cohort study.

“In our study, child abuse was associated with both accelerated and delayed age at menarche, and importantly, these associations vary by type of abuse, which suggests that child abuse does not have a homogeneous effect on health outcomes,” said Boynton-Jarrett. “There is a need for future research to explore characteristics of child abuse that may influence health outcomes including type, timing and severity of abuse, as well as the social context in which the abuse occurs.”

Child abuse is associated with a significant health burden over the life course. Early menarche has been associated with risks such as cardiovascular disease, metabolic dysfunction, cancer and depression, while late menarche has been associated with lower bone mineral density and depression.

“We need to work toward better understanding how child abuse influences health and translate these research findings into clinical practice and public health strategies to improve the well-being of survivors of child abuse,” added Boynton-Jarrett.



Story Source:

The above story is reprinted from materials provided by Boston University Medical Center, via EurekAlert!, a service of AAAS.



Journal Reference:

  1. Renée Boynton-Jarrett, Rosalind J. Wright, Frank W. Putnam, Eileen Lividoti Hibert, Karin B. Michels, Michele R. Forman, Janet Rich-Edwards. Childhood Abuse and Age at Menarche. Journal of Adolescent Health, 2012; DOI: 10.1016/j.jadohealth.2012.06.006


The longer you’re awake, the slower you get


ScienceDaily (July 27, 2012) — Anyone who has ever had trouble sleeping can attest to the difficulties at work the following day. Experts recommend eight hours of sleep per night for ideal health and productivity, but what if five to six hours of sleep is your norm? Is your work still negatively affected? A team of researchers at Brigham and Women’s Hospital (BWH) has discovered that, regardless of how tired you perceive yourself to be, lack of sleep can influence the way you perform certain tasks.

This finding is published in the July 26, 2012 online edition of The Journal of Vision.

“Our team decided to look at how sleep might affect complex visual search tasks, because they are common in safety-sensitive activities, such as air-traffic control, baggage screening, and monitoring power plant operations,” explained Jeanne F. Duffy, PhD, MBA, senior author on this study and associate neuroscientist at BWH. “These types of jobs involve processes that require repeated, quick memory encoding and retrieval of visual information, in combination with decision making about the information.”

Researchers collected and analyzed visual search task data from 12 participants over a one-month study. In the first week, all participants were scheduled to sleep 10-12 hours per night to make sure they were well rested. For the following three weeks, the participants were scheduled to sleep the equivalent of 5.6 hours per night, and also had their sleep times scheduled on a 28-hour cycle, mirroring chronic jet lag. The research team gave the participants computer tests involving visual search tasks and recorded how quickly they could find important information, as well as how accurately they identified it. The researchers report that the longer the participants were awake, the more slowly they identified the important information in the test. Additionally, during the biological night (12 a.m. to 6 a.m.), participants (who were unaware of the time throughout the study) also performed the tasks more slowly than they did during the daytime.
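The headline relationship, response time increasing with time awake, is the kind of trend that can be estimated with a simple linear fit. The sketch below uses made-up numbers purely to illustrate the analysis shape; it is not the study's data or its statistical method.

```python
import numpy as np

# Hypothetical measurements: hours awake vs. mean search response time.
hours_awake = np.array([2, 6, 10, 14, 18, 22])
response_time_s = np.array([3.1, 3.3, 3.6, 4.0, 4.4, 4.9])  # invented values

# Least-squares line: slope is the estimated slowdown per hour awake.
slope, intercept = np.polyfit(hours_awake, response_time_s, 1)
print(f"slowdown of about {slope:.2f} s per hour awake")
```

A positive slope corresponds to the reported finding; the actual study also separated this time-awake effect from the circadian (time-of-day) effect, which a single regression like this cannot do.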

“This research provides valuable information for workers, and their employers, who perform these types of visual search tasks during the night shift, because they will do it much more slowly than when they are working during the day,” said Duffy. “The longer someone is awake, the more the ability to perform a task, in this case a visual search, is hindered, and this impact of being awake is even stronger at night.”

While the accuracy of the participants stayed fairly constant, they were slower to identify the relevant information as the weeks went on. Self-ratings of sleepiness got only slightly worse during the second and third weeks on the study schedule, yet the data show that participants were performing the visual search tasks significantly more slowly than in the first week. This finding suggests that people’s perceptions of how tired they are do not always match their performance ability, explains Duffy.

This research was supported by NIH grant P01 AG09975 and was conducted in the BWH CCI, part of the Harvard Catalyst Clinical and Translational Science Center (UL1 RR025758-01), formerly a GCRC (M01RR02635). Development and implementation of the visual search task was supported in part by NIH grant R21 AT002571. JFD was supported in part by the BWHBRI Fund to Sustain Research Excellence; MM was supported by fellowships from the La-Roche and Novartis Foundations (Switzerland) and Jazz Pharmaceuticals (USA); SWC was supported in part by a fellowship from the Natural Sciences and Engineering Research Council of Canada.



Story Source:

The above story is reprinted from materials provided by Brigham and Women’s Hospital.



Journal Reference:

  1. Marc Pomplun, Edward J. Silva, Joseph M. Ronda, Sean W. Cain, Mirjam Y. Münch, Charles A. Czeisler, and Jeanne F. Duffy. The effects of circadian phase, time awake, and imposed sleep restriction on performing complex visual tasks: Evidence from comparative visual search. The Journal of Vision, July 26, 2012 DOI: 10.1167/12.7.14


The Olympics and bare feet: What have we learned?


ScienceDaily (July 27, 2012) — Ethiopian runner Abebe Bikila made history when he earned a gold medal at the 1960 Summer Olympics in Rome. His speed and agility won him the gold, but it was barefoot running that made him a legend.

When the shoes Bikila was given for the race didn’t fit comfortably, he ditched them for his bare feet. After all, that’s the way he had trained for the Olympics in his homeland.

Racing shoeless led to success for Bikila, and now, more than 50 years later, runners are continuing to take barefoot strides. Several Olympic runners have followed Bikila’s lead, and nationally the trend has exploded over the past decade. There is even a national association dedicated to barefoot running. However, scientists remain divided on whether it prevents or increases injuries.

“Bikila may have been on to something,” said Carey Rothschild, an instructor of physical therapy at the University of Central Florida in Orlando who specializes in orthopedic sports injuries. “The research is really not conclusive on whether one approach is better than the other. But what is clear is that it’s really a matter of developing a good running form and sticking to it, not suddenly changing it.”

Rothschild, a 12-year runner who has completed the Boston Marathon three times, reviewed research and found injuries happened with or without shoes. So she conducted a survey with the help of the Track Shack in Orlando to get to the bottom of the controversy.

What she found was striking.

Most people said they turned to barefoot running in the hopes of improving performance and reducing injuries. Ironically, those who said they never tried it avoided it for fear it would cause injuries and slow their times.

However, research shows that there are risks to running no matter what someone puts on his or her feet.

Barefoot runners tend to land on their mid or forefoot as opposed to the heel, which good athletic shoes try to cushion.

Some studies suggest that barefoot running causes a higher level of stress fractures on the front part of the foot and increased soreness in the calves. But runners who wear athletic shoes can also suffer everything from knee injuries to hip problems, related to repeated stress from impact forces at the heel.

“There is no perfect recipe,” said Rothschild, a resident of Winter Park.

In a paper to be published next month in the Journal of Strength & Conditioning Research, Rothschild reviews the research and provides a guide for those who want to explore barefoot running as a way to train for marathons. It is a 10- to 12-week program that slowly eases runners accustomed to shoes onto their bare feet.

She suggests getting a thorough physical examination and biomechanical assessment from a physical therapist or other trained professional, so that strength and flexibility deficits can be identified and addressed before a runner gradually transitions to bare feet.

“The bottom line is that when a runner goes from shoes to no shoes, their body may not automatically change its gait,” Rothschild said. “But there are ways to help make that transition smoother and lower the risk of injuries.”

The researcher concludes that barefoot running in and of itself is neither good nor bad. As with running in shoes, proper training and conditioning are essential.

However, Rothschild does offer a warning.

Anyone with a lower extremity injury or deformity, or with a disease that causes a lack of sensation in the feet, should probably avoid barefoot running, because they cannot necessarily feel injuries resulting from running on hard surfaces.



Story Source:

The above story is reprinted from materials provided by University of Central Florida.



Journal Reference:

  1. Carey E. Rothschild. Primitive Running. Journal of Strength and Conditioning Research, 2012; 26 (8): 2021 DOI: 10.1519/JSC.0b013e31823a3c54
