Drinking coffee associated with lower risk of death from all causes, study finds


(credit: iStock)

People who drink around three cups of coffee a day may live longer than non-coffee drinkers, a landmark study has found.

The findings — published in the journal Annals of Internal Medicine — come from the largest study of its kind, in which scientists analyzed data from more than half a million people across 10 European countries to explore the effect of coffee consumption on risk of mortality.

Researchers from the International Agency for Research on Cancer (IARC) and Imperial College London found that higher levels of coffee consumption were associated with a reduced risk of death from all causes, particularly from circulatory diseases and diseases related to the digestive tract.

“We found that higher coffee consumption was associated with a lower risk of death from any cause, and specifically for circulatory diseases, and digestive diseases,” said lead author Marc Gunter of the IARC and formerly at Imperial’s School of Public Health. “Importantly, these results were similar across all of the 10 European countries, with variable coffee drinking habits and customs. Our study also offers important insights into the possible mechanisms for the beneficial health effects of coffee.”

Healthier livers, better glucose control

Using data from the EPIC study (European Prospective Investigation into Cancer and Nutrition), the group analyzed data from 521,330 people over the age of 35 from 10 European countries, including the UK, France, Denmark and Italy. People’s diets were assessed using questionnaires and interviews, with the highest level of coffee consumption (by volume) reported in Denmark (900 mL per day) and the lowest in Italy (approximately 92 mL per day). Those who drank more coffee were also more likely to be younger, to smoke, to drink alcohol, and to eat more meat and fewer fruits and vegetables.

After 16 years of follow-up, almost 42,000 people in the study had died from a range of conditions, including cancer, circulatory diseases, heart failure and stroke. Following careful statistical adjustment for lifestyle factors such as diet and smoking, the researchers found that the group with the highest coffee consumption had a lower risk of death from all causes compared with those who did not drink coffee.

They found that decaffeinated coffee had a similar effect.

In a subset of 14,000 people, they also analyzed metabolic biomarkers, and found that coffee drinkers may have healthier livers overall and better glucose control than non-coffee drinkers.

According to the group, more research is needed to find out which of the compounds in coffee may be providing a protective effect or otherwise benefiting health.* Other avenues of research could include intervention studies looking at the effect of coffee drinking on health outcomes.

However, Gunter noted that “due to the limitations of observational research, we are not at the stage of recommending people to drink more or less coffee. That said, our results suggest that moderate coffee drinking is not detrimental to your health, and that incorporating coffee into your diet could have health benefits.”

The study was funded by the European Commission Directorate General for Health and Consumers and the IARC.

* Coffee contains a number of compounds that can interact with the body, including caffeine, diterpenes and antioxidants, and the ratios of these compounds can be affected by the variety of methods used to prepare coffee.

Abstract of Coffee Drinking and Mortality in 10 European Countries: A Multinational Cohort Study

Background: The relationship between coffee consumption and mortality in diverse European populations with variable coffee preparation methods is unclear.

Objective: To examine whether coffee consumption is associated with all-cause and cause-specific mortality.

Design: Prospective cohort study.

Setting: 10 European countries.

Participants: 521 330 persons enrolled in EPIC (European Prospective Investigation into Cancer and Nutrition).

Measurements: Hazard ratios (HRs) and 95% CIs estimated using multivariable Cox proportional hazards models. The association of coffee consumption with serum biomarkers of liver function, inflammation, and metabolic health was evaluated in the EPIC Biomarkers subcohort (n = 14 800).

Results: During a mean follow-up of 16.4 years, 41 693 deaths occurred. Compared with nonconsumers, participants in the highest quartile of coffee consumption had statistically significantly lower all-cause mortality (men: HR, 0.88 [95% CI, 0.82 to 0.95]; P for trend < 0.001; women: HR, 0.93 [CI, 0.87 to 0.98]; P for trend = 0.009). Inverse associations were also observed for digestive disease mortality for men (HR, 0.41 [CI, 0.32 to 0.54]; P for trend < 0.001) and women (HR, 0.60 [CI, 0.46 to 0.78]; P for trend < 0.001). Among women, there was a statistically significant inverse association of coffee drinking with circulatory disease mortality (HR, 0.78 [CI, 0.68 to 0.90]; P for trend < 0.001) and cerebrovascular disease mortality (HR, 0.70 [CI, 0.55 to 0.90]; P for trend = 0.002) and a positive association with ovarian cancer mortality (HR, 1.31 [CI, 1.07 to 1.61]; P for trend = 0.015). In the EPIC Biomarkers subcohort, higher coffee consumption was associated with lower serum alkaline phosphatase; alanine aminotransferase; aspartate aminotransferase; γ-glutamyltransferase; and, in women, C-reactive protein, lipoprotein(a), and glycated hemoglobin levels.

Limitations: Reverse causality may have biased the findings; however, results did not differ after exclusion of participants who died within 8 years of baseline. Coffee-drinking habits were assessed only once.


Conclusion: Coffee drinking was associated with reduced risk for death from various causes. This relationship did not vary by country.

Primary Funding Source: European Commission Directorate-General for Health and Consumers and International Agency for Research on Cancer.

Abstract of Association of Coffee Consumption With Total and Cause-Specific Mortality Among Nonwhite Populations

Background: Coffee consumption has been associated with reduced risk for death in prospective cohort studies; however, data in nonwhites are sparse.

Objective: To examine the association of coffee consumption with risk for total and cause-specific death.

Design: The MEC (Multiethnic Cohort), a prospective population-based cohort study established between 1993 and 1996.

Setting: Hawaii and Los Angeles, California.

Participants: 185 855 African Americans, Native Hawaiians, Japanese Americans, Latinos, and whites aged 45 to 75 years at recruitment.

Measurements: Outcomes were total and cause-specific mortality between 1993 and 2012. Coffee intake was assessed at baseline by means of a validated food-frequency questionnaire.

Results: 58 397 participants died during 3 195 484 person-years of follow-up (average follow-up, 16.2 years). Compared with drinking no coffee, coffee consumption was associated with lower total mortality after adjustment for smoking and other potential confounders (1 cup per day: hazard ratio [HR], 0.88 [95% CI, 0.85 to 0.91]; 2 to 3 cups per day: HR, 0.82 [CI, 0.79 to 0.86]; ≥4 cups per day: HR, 0.82 [CI, 0.78 to 0.87]; P for trend < 0.001). Trends were similar between caffeinated and decaffeinated coffee. Significant inverse associations were observed in 4 ethnic groups; the association in Native Hawaiians did not reach statistical significance. Inverse associations were also seen in never-smokers, younger participants (<55 years), and those who had not previously reported a chronic disease. Among examined end points, inverse associations were observed for deaths due to heart disease, cancer, respiratory disease, stroke, diabetes, and kidney disease.

Limitation: Unmeasured confounding and measurement error, although sensitivity analysis suggested that neither was likely to affect results.

Conclusion: Higher consumption of coffee was associated with lower risk for death in African Americans, Japanese Americans, Latinos, and whites.

Primary Funding Source: National Cancer Institute.

KurzweilAI » News http://ift.tt/tNEwY4

“SEO Is Always Changing”… Or Is It?: Debunking the Myth and Getting Back to Basics


Posted by bridget.randolph

Recently I made the shift to freelancing full-time, and it’s led me to participate in a few online communities for entrepreneurs, freelancers, and small business owners. I’ve noticed a trend in the way many of them talk about SEO; specifically, the blocks they face in attempting to “do SEO” for their businesses. Again and again, people cited the idea that "SEO is too hard to stay on top of… it’s always changing" as a major reason they feel a) overwhelmed by SEO, b) intimidated by SEO, and c) uninformed about SEO.

And it’s not just non-SEOs who use this phrase. The concept of “the ever-changing landscape of SEO” is common within SEO circles as well. In fact, I’ve almost certainly used this phrase myself.

But is it actually true?

To answer that question, we have to separate the theory of search engine optimization from the various tactics which we as SEO professionals spend so much time debating and testing. The more I work with smaller businesses and individuals, the clearer it becomes to me that although the technology is always evolving, and tactics (particularly those that attempt to trick Google rather than follow its guidelines) do need to adapt fairly rapidly, there are certain fundamentals of SEO that change very little over time, and which a non-specialist can easily understand.

The unchanging fundamentals of SEO

Google’s algorithm is based on an academia-inspired model of categorization and citations, which utilizes keywords as a way to decipher the topic of a page, and links from other sites (known as “backlinks”) to determine the relative authority of that site. Their method and technology keep getting more sophisticated over time, but the principles have remained the same.

So what are these basic principles?

It comes down to answering the following questions:

  1. Can the search engine find your content? (Crawlability)
  2. How should the search engine organize and prioritize this content? (Site structure)
  3. What is your content about? (Keywords)
  4. How does the search engine know that your content provides trustworthy information about this topic? (Backlinks)

If your website is set up to help Google and other search engines answer these 4 questions, you will have covered the basic fundamentals of search engine optimization.

There is a lot more that you can do to optimize in all of these areas and beyond, but for businesses that are just starting out and/or on a tight budget, these are the baseline concepts you’ll need to know.


Crawlability

You could have the best content in the world, but it won’t drive any search traffic if the search engines can’t find it. This means that the crawlability of your site is one of the most important factors in ensuring a solid SEO foundation.

In order to find your content and rank it in the search results, a search engine needs to be able to:

  1. Access the content (at least the pages that you want to rank)
  2. Read the content

This is primarily a technical task, although it is related to having a good site structure (the next core area). You may need to adapt the code, and/or use an SEO plugin if your site runs on WordPress.
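One basic crawlability check can be sketched with Python’s standard library: parse a site’s robots.txt rules and ask whether a given crawler may fetch a given URL. The robots.txt content and URLs below are invented for illustration; a real check would fetch the live file instead.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration only.
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Pages a crawler is blocked from can never rank, no matter how good
# the content is.
print(parser.can_fetch("Googlebot", "https://example.com/blog/my-post"))      # True
print(parser.can_fetch("Googlebot", "https://example.com/admin/settings"))    # False
```

In practice you would run a check like this (or use a crawling tool) across every page you want to rank, to catch accidental blocks before they cost you traffic.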

For more in-depth guides to technical SEO and crawlability, check out the following posts:

Site structure

In addition to making sure that your content is accessible and crawlable, it’s also important to help search engines understand the hierarchy and relative importance of that content. It can be tempting to think that every page is equally important to rank, but failing to structure your site in a hierarchical way often dilutes the impact of your “money” pages. Instead, you should think about what the most important pages are, and structure the rest of your site around these.

When Google and other search engine crawlers visit a site, they start at the homepage and then follow every link. Googlebot assumes that the pages it sees most often are the most important. So when a page can be reached with a single click from the homepage, or is linked to on every page (for example, in a top or side navigation bar, or a site footer section), Googlebot will see it more often and will therefore consider it more important. For less important pages, you’ll still need to link to them from somewhere for search engines to find them, but you don’t need to emphasize them as frequently or keep them as close to the homepage.

The main question to ask is: Can search engines tell what your most important pages are, just by looking at the structure of your website? Google’s goal is to save users steps, so the easier you make it for them to find and prioritize your content, the more they’ll like it.
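The idea of “closeness to the homepage” can be made concrete with a short sketch: treat the site’s internal links as a graph and compute each page’s click depth with a breadth-first search. All page names below are hypothetical.

```python
from collections import deque

# Toy internal-link graph: each page lists the pages it links to.
site_links = {
    "home": ["products", "about", "blog"],
    "products": ["widget-a", "widget-b"],
    "blog": ["post-1"],
    "about": [],
    "widget-a": [],
    "widget-b": [],
    "post-1": ["widget-a"],
}

def click_depth(links, start="home"):
    """Breadth-first search: minimum number of clicks from the homepage."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

depths = click_depth(site_links)
print(depths["widget-a"])  # 2: home -> products -> widget-a
```

Pages with a low click depth (and many internal links pointing at them) are the ones crawlers will tend to treat as most important, which is why your “money” pages should sit near the top of this hierarchy.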

For more in-depth guides to good site structure, check out the following posts:


Keywords

Once the content you create is accessible to crawlers, the next step is to make sure that you’re giving the search engines an accurate picture of what that content is about, to help them understand which search queries your pages would be relevant to. This is where keywords come into the mix.

We use keywords to tell the search engines what each page is about, so that they can rank our content for the queries most relevant to our website. You might hear advice to use your keywords over and over again on a page in order to rank well. The problem with this approach is that it doesn’t always create a great experience for users, and over time Google has stopped ranking pages that it perceives as offering a poor user experience.

Instead, what Google is looking for in terms of keyword usage is that you:

  1. Answer the questions that real people actually have about your topic
  2. Use the terminology that real people (specifically, your target audience) actually use to refer to your topic
  3. Use the term in the way that Google thinks real people use it (this is often referred to as “user intent” or “searcher intent”).

You should only ever target one primary keyword (or phrase) per page. You can include “secondary” keywords, which are related to the primary keyword directly (think category vs subcategory). I sometimes see people attempting to target too many topics with a single page, in an effort to widen the net. But it is better to separate these out so that there’s a different page for each different angle on the topic.

The easiest way to think about this is in physical terms. Search engines’ methods are roughly based on the concept of library card catalogs, and so we can imagine that Google is categorizing pages in a similar way to a library using the Dewey decimal system to categorize books. You might have a book categorized as Romance, subcategory Gothic Romance; but you wouldn’t be able to categorize it as Romance and also Horror, even though it might be related to both topics. You can’t have the same physical book on 2 different shelves in 2 different sections of the library. Keyword targeting works the same way: 1 primary topic per page.

For more in-depth guides to keyword research and keyword targeting, check out the following posts:


Backlinks

Another longstanding ranking factor is the number of links from other sites to your content, known as backlinks.

It’s not enough for you to say that you’re the expert in something, if no one else sees it that way. If you were looking for a new doctor, you wouldn’t just go with the guy who says “I’m the world’s best doctor.” But if a trusted friend told you that they loved their doctor and that they thought you’d like her too, you’d almost certainly make an appointment.

When other websites link to your site, it helps to answer the question: “Do other people see you as a trustworthy resource?” Google wants to provide correct and complete information to people’s queries. The more trusted your content is by others, the more that indicates the value of that information and your authority as an expert.

When Google looks at a site’s backlinks, they are effectively doing the same thing that humans do when they read reviews and testimonials to decide which product to buy, which movie to see, or which restaurant to go to for dinner. If you haven’t worked with a product or business, other people’s reviews point you to what’s good and what’s not. In Google’s case, a link from another site serves as a vote of confidence for your content.

That being said, not all backlinks are treated equally when it comes to boosting your site’s rankings. They are weighted differently according to how Google perceives the quality and authority of the site that’s doing the linking. This can feel a little confusing, but when you think about it in the context of a recommendation, it becomes a lot easier to understand whether the backlinks your site is collecting are useful or not. After all, think about the last time you saw a movie. How did you choose what to see? Maybe you checked well-known critics’ reviews, checked Rotten Tomatoes, asked friends’ opinions, looked at Netflix’s suggestions list, or saw acquaintances posting about the film on social media.

When it comes to making a decision, who do you trust? As humans, we tend to use an (often unconscious) hierarchy of trust:

  1. Personalized recommendation: Close friends who know me well are most likely to recommend something I’ll like;
  2. Expert recommendation: Professional reviewers who are authorities on the art of film are likely to have a useful opinion, although it may not always totally match my personal taste;
  3. Popular recommendation: If a high percentage of random people liked the movie, this might mean it has a wide appeal and will likely be a good experience for me as well;
  4. Negative association: If someone is raving about a movie on social media and I know that they’re a terrible human with terrible taste… well, in the absence of other positive signals, that fact might actually influence me not to see the movie.

To bring this back to SEO, you can think about backlinks as the SEO version of reviews. And the same hierarchy comes into play.

  1. Personalized/contextual recommendation: For local businesses or niche markets, very specific websites like a local city’s tourism site, local business directory or very in-depth, niche fan site might be the equivalent of the “best friend recommendation”. They may not be an expert in what everyone likes, but they definitely know what works for you as an individual and in some cases, that’s more valuable.
  2. Expert recommendation: Well-known sites with a lot of inherent trust, like the BBC or Harvard University, are like the established movie critics. Broadly speaking they are the most trustworthy, but possibly lacking the context for a specific person’s needs. In the absence of a highly targeted type of content or service, these will be your strongest links.
  3. Popular recommendation: All things being equal, a lot of backlinks from a lot of different sites is seen as a signal that the content is relevant and useful.
  4. Negative association: Links that are placed via spam tactics, that you buy in bulk, or that sit on sites that look like garbage, are the website equivalent of that terrible person whose recommendation actually turns you off the movie.

If a site collects too many links from poor-quality sites, it can look as though those links were bought rather than "earned" recommendations (similar to businesses paying people to write positive reviews). Google views buying links as a dishonest practice and a way of gaming its system, so if it believes you are doing this intentionally, it may trigger a penalty. Even if they don’t cause a penalty, you won’t gain any real value from poor-quality links, so they’re certainly not something to aim for. Because of this, some people become very risk-averse about backlinks, even ones that came to them naturally. But as long as you are getting links from other trustworthy sources, and these high-quality links make up a substantially higher percentage of your total, having a handful of lower-quality sites linking to you shouldn’t prevent you from benefiting from the high-quality ones.
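The notion that a link passes along more authority when the linking site is itself well-linked can be illustrated with a toy version of the original PageRank idea. This is a deliberately simplified sketch, not Google’s actual algorithm, and all of the site names are invented.

```python
# Toy link graph: each site lists the sites it links to.
links = {
    "trusted-news": ["your-site"],
    "fan-blog": ["your-site", "trusted-news"],
    "spam-site": ["your-site"],
    "your-site": [],
}

def pagerank(links, damping=0.85, iterations=50):
    """Iteratively share each page's score among the pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # Pages with no outlinks spread their score across all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
        rank = new_rank
    return rank

rank = pagerank(links)
# "your-site" ends up with the highest score: three sites link to it,
# and one of them ("trusted-news") is itself linked to, so its vote
# carries more weight than the one from "spam-site".
```

The point of the sketch is the weighting, not the exact numbers: a single link from an authoritative page can be worth more than several links from pages nobody links to.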

For more in-depth guides to backlinks, check out the following posts:

Theory of Links

Getting More Links

Mitigating Risk of Links

Does anything about SEO actually change?

If SEO is really this simple, why do people talk about how it changes all the time? This is where we have to separate the theory of SEO from the tactics we use as SEO professionals to grow traffic and optimize for better rankings.

The fundamentals that we’ve covered here — crawlability, site structure, keywords, and backlinks — are the theory of SEO. But when it comes to actually making it work, you need to use tactics to optimize these areas. And this is where we see a lot of changes happening on a regular basis, because Google and the other search engines are constantly tweaking the way the algorithm understands and utilizes information from those four main areas in determining how a site’s content should rank on a results page.

The important thing to know is that, although the tactics which people use will change all the time, the goal for the search engine is always the same: to provide searchers with the information they need, as quickly and easily as possible. That means that whatever tactics and strategies you choose to pursue, the important thing is that they enable you to optimize for your main keywords, structure your site clearly, keep your site accessible, and get more backlinks from more sites, while still keeping the quality of the site and the backlinks high.

The quality test (EAT)

Because Google’s goal is to provide high-quality results, the changes that they make to the algorithm are designed to improve their ability to identify the highest quality content possible. Therefore, when tactics stop working (or worse, backfire and incur penalties), it is usually related to the fact that these tactics didn’t create high-quality outputs.

Like the fundamentals of SEO theory which we’ve already covered, the criteria that Google uses to determine whether a website or page is good quality haven’t changed all that much since the beginning. They’ve just gotten better at enforcing them. This means that you can use these criteria as a “sniff test” when considering whether a tactic is likely to be a sustainable approach long-term.

Google themselves refer to these criteria in their Search Quality Rating Guidelines with the acronym EAT, which stands for:

  • Expertise
  • Authoritativeness
  • Trustworthiness

In order to be viewed as high-quality content (on your own site) or a high-quality link (from another site to your site), the content needs to tick at least one of these boxes.


Expertise

Does this content answer a question people have? Is it a *good* answer? Do you have a more in-depth degree of knowledge about this topic than most people?

This is why you will see people talk about Google penalizing “thin” content — that just refers to content which isn’t really worth having on its own page, because it doesn’t provide any real value to the reader.


Authoritativeness

Are you someone who is respected and cited by others who know something about this topic?

This is where the value of backlinks can come in. One way to demonstrate that you are an authority on a topic is if Google sees a lot of other reputable sources referring to your content as a source or resource.


Trustworthiness

Are you a reputable person or business? Can you be trusted to take good care of your users and their information?

Because trustworthiness is a factor in determining a site’s quality, Google has compiled a list of indicators which might mean a site is untrustworthy or spammy. These include things like a high proportion of ads to regular content, behavior that forces or manipulates users into taking actions they didn’t want to take, hiding some content and only showing it to search engines to manipulate rankings, not using a secure platform to take payment information, etc.

It’s always the same end goal

Yes, SEO can be technical, and yes, it can change rapidly. But at the end of the day, what doesn’t change is the end goal. Google and the other search engines make money through advertising, and in order to get more users to see (and click on) their ads, they have to provide a great user experience. Therefore, their goal is always going to be to give the searchers the best information they can, as easily as they can, so that people will keep using their service.

As long as you understand this, the theory of SEO is pretty straightforward. It’s just about making it easy for Google to answer these questions:

  1. What is your site about?
    1. What information does it provide?
    2. What service or function does it provide?
  2. How do we know that you’ll provide the best answer or product or service for our users’ needs?
  3. Does your content demonstrate Expertise, Authoritativeness, and/or Trustworthiness (EAT)?

This is why the fundamentals have changed so little, despite the fact that the industry, technology and tactics have transformed rapidly over time.

A brief caveat

My goal with this post is not to provide step-by-step instruction in how to “do SEO,” but rather to demystify the basic theory for those who find the topic too overwhelming to know where to start, or who believe that it’s too complicated to understand without years of study. With this goal in mind, I am intentionally taking a simplified and high-level perspective. This is not to dismiss the importance of an SEO expert in driving strategy and continuing to develop and maximize value from the search channel. My hope is that those business owners and entrepreneurs who currently feel overwhelmed by this topic can gain a better grasp on the way SEO works, and a greater confidence and ease in approaching their search strategy going forward.

I have provided a few in-depth resources for each of the key areas — but you will likely want to hire a specialist or consultant to assist with analysis and implementation (certainly if you want to develop your search strategy beyond simply the “table stakes” as Rand calls it, you will need a more nuanced understanding of the topic than I can provide in a single blog post).

At the end of the day, the ideas behind SEO are actually pretty simple — it’s the execution that can be more complex or simply time-consuming. That’s why it’s important to understand that theory — so that you can be more informed if and when you do decide to partner with someone who is offering that expertise. As long as you understand the basic concepts and end goal, you’ll be able to go into that process with confidence. Good luck!


Moz Blog https://moz.com/blog

Facebook will allow paywalled Instant Articles later this year


Facebook is adding the option for publishers to lock Instant Articles shared on the social media service behind a paywall later this year, Campbell Brown, the company’s head of news services, said at a conference yesterday, according to The Street.

Publishers will be able to lock out users after 10 articles in a month

The new feature will reportedly be built on top of the company’s existing Instant Articles feature. Publishers will be able to direct users to sign up for a digital subscription or to lock out readers after they’ve read 10 articles in a month, similar to the limitations that publications like The New York Times already place on their websites.

According to Brown, the new subscription option is in response to requests from publishers who have been lobbying the company for a paywall for shared articles on Facebook. The new subscription service will reportedly begin tests in October, although details are still slim — including whether or not Facebook will share in the revenue from subscriptions purchased through the application.

The Verge http://ift.tt/1jLudMg

I went to San Diego Comic-Con and ended up in Westworld


HBO’s series Westworld has sparked some useful cultural shorthand for describing any kind of live entertainment where participants directly engage with actors and environments to create their own story. A Disney resort where people can have their own Star Wars adventures? That’s a Westworld. An 11-month-long immersive experience where participants try to save a young woman from a supernatural cult? That’s a Westworld, too. So perhaps it shouldn’t come as a total shock that when it came time to promote its own show, HBO decided to just bring the real thing to San Diego Comic-Con International.

The goal of Westworld: The Experience is exactly that: give visitors an idea of what it would feel like to step inside the television show itself. Designed by New York-based marketing agency Campfire, it doesn’t bother with virtual reality or other technological gimmickry. It’s a tactile, human experience, relying on sets, props, and a cast of wonderfully dedicated actors to work its magic.

The journey actually begins at a hotel next to the convention center. There, a player piano plinks out the various rock covers and themes from the show, while a massive sign for the Westworld park — A Delos Destination, it specifies — hangs nearby. The mood is instantly set by two Delos representatives, clad head-to-toe in white, who help guests sign up for their visits later in the day.

When I arrived for my appointment at the actual venue (it’s nestled in San Diego’s nearby Gaslamp Quarter), my group was greeted by another host, who guided us upstairs into the Delos offices. And when I say “the Delos offices,” I mean it felt exactly like stepping into one of the company’s offices, as seen on the show. Behind a frosted glass door, a large video screen played the Westworld promotional reel, and another group of hosts showed us the clothing, weaponry, and costumes used in the park.

These weren’t props; they were items being used in the actual Westworld resort

Bringing out props and costumes is a go-to for the promotional installations that pop up at places like Comic-Con and SXSW. But there was already an important distinction setting this experience apart. We weren’t being shown costumes from the show Westworld. These hosts were showing us items that were being used at Westworld, the resort. It was a subtle distinction — a signal that we were having an “in-show” experience rather than a promotional one — but it set the tone for what was to come.

A table filled with guns and knives dominated the room, and as I peered closer, one of the hosts asked me which weapon I preferred. His performance was so committed, so perfectly in sync with the hosts in the series, that I found myself playing along earnestly. I told him I would have normally picked a gun, but that the large bowie knife had caught my eye. Then, running with the moment, I presented him with a real concern: “If I kill a host in the park, will it feel like killing a person?” I asked.

He leaned in. “If you can’t tell the difference, does it matter?”

Now, I know he was riffing on a line from the show. What I don’t know is whether that was a stock rejoinder the actor had ready for this kind of exchange, or a moment of pure improvisation. Either way, it didn’t matter. It was a pitch-perfect Westworld response, with even the actor’s delivery filled with moral ambiguity and a dark call to action. And I was completely hooked.

“If I kill a host in the park, will it feel like killing a person?”

My group of six was slowly split up and taken to processing rooms, where we would each have one-on-one evaluations to determine whether we were mentally fit for what was ahead. In the hallways, the attention to detail was remarkable for this kind of event. Every single door had a specific label and purpose; when I asked about the pained moans coming from behind a door marked “R&D,” our host had a ready response. “That’s our research and development lab,” he said. “But this trip, we’re focusing on Westworld.” A reference to the slaughter at the end of the first season? A tease for what’s to come? It could have been either, but every moment and piece of signage had been considered, perpetuating the illusion of being wrapped inside the show.

As for the evaluation process itself, I can only say it was an example of how effective interacting with a good actor can be. I was asked a series of questions: What percentage of my dreams would I characterize as nightmares? If I had to lose a finger, which would I pick? They all had a vaguely Voight-Kampff feel, but it was the performer, reacting to my responses in real time while nailing the slightly detached, perfectly pleasant vocal characteristics of a Delos host, that brought it to life.

For my final question, I was presented with a dilemma: I’m in a saloon with five other people when bandits burst in and kill everyone. The outlaws offer me a gun, and the opportunity to join them, leave, or fight back. What would I do?

I don’t want to reveal my exact answer, but when it came time for the host to choose either a white or a black hat for me, she opted for the latter.

My group reconvened and we were shown a brief promo video — this was actually the weakest part of the experience, because it broke the illusion everyone else had been so busy creating — and then we walked down a long hallway toward a single door. It was the entrance to the park itself: the Mariposa Saloon.

At this point, the experience essentially turned into a low-key, themed cocktail party. An actor at the bar played a riff on Thandie Newton’s character, Maeve Millay, while another player piano provided the musical accompaniment. Behind the bar, the bartender (a Chicago-based mixologist named Paul McGee) prepared three different specialty cocktails for my group. It wasn’t exactly authentic to the saloon in the show — the real Westworld usually comes across as a “straight whiskey or nothing” kind of place — but the drinks added an air of high-end trendiness that did seem like it would be in line with Delos’ clientele.

As my group started talking, we compared notes about the various answers we’d given our different hosts. That’s when I realized the final question suggested a scenario where I would be in a saloon with five other people when violence broke out. And there I was, standing in a saloon with five other people.

That violence never materialized. The experience ultimately petered out with a Delos host entering the saloon, ushering us back into the real world and out of the company’s offices. But for 30 minutes or so, Westworld: The Experience actually did an incredibly good job of opening the door into a fictional world and ushering my group inside. Comic-Con has only just begun, but I’m already comfortable saying that the Westworld experience will be one of this year’s highlights for me — if not the highlight.

Easily one of the highlights of Comic-Con

Brand activations and promotional experiences are never-ending at places like Comic-Con, and they often get fans in the doors by relying on whatever new technology trend is emerging at a given moment. It’s how you end up with conventions full of mediocre Samsung Gear VR movie tie-ins. But HBO and Campfire leaned away from that trend, focusing instead on real, human interactions. The results aren’t just fun; they’re aligned with the themes of the shows being promoted.

With the rise of theme parks, escape rooms, and immersive theater, it’s clear that audiences have an increasing appetite for entertainment experiences that go beyond passive screen-watching or traditional gaming. The popularity of the Westworld show itself might be further proof, and with massive corporations like Disney embracing the trend to lure audiences away from their living rooms, these kinds of projects will only become more ubiquitous. What’s wonderful about something like Westworld: The Experience is that it’s a perfect gateway drug: a way for people who have never seen a piece of immersive entertainment to kick the tires and see what it’s like to be present in a fictional world, or play a scene opposite an actor.

The only downside is that the installation will only be running through this Sunday at Comic-Con. Westworld: The Experience could easily be remounted in different locations in the future, and HBO does have a history of touring things like the escape room installation it brought to SXSW this year. But if you’re at Comic-Con and enjoy Westworld at all, don’t think twice: go.

Photography by Bryan Bishop / The Verge

The Verge http://ift.tt/1jLudMg

Verizon admits to throttling video in apparent violation of net neutrality


Yesterday, we reported that Verizon Wireless appeared to be throttling Netflix traffic — and today, the company seems to have come clean. In a statement provided to Ars Technica and The Verge, Verizon implicitly admitted to capping the traffic, blaming the issue on a temporary video optimization test.

“We’ve been doing network testing over the past few days to optimize the performance of video applications on our network,” a Verizon Wireless spokesperson said. “The testing should be completed shortly. The customer video experience was not affected.”

This is a really weird statement, seemingly referring to something completely different from what customers actually experienced. What customers saw wasn’t optimization, but a clear cap, with tests from Netflix’s speed-test tool showing measurably lower rates than non-Netflix tests.

While Netflix was the only service to have a speed-test tool producing measurements, it now appears that similar caps were applied to all video applications on the Verizon Wireless network.

A subsequent statement from a Verizon representative took issue with this article, calling it “dead wrong” and saying that it “makes no sense.”

“We are constantly testing the network,” the representative said. “It’s what we do, to optimize performance for our customers. The test was across the board, and did not target any individual applications.”

At the same time, the representative confirmed that a 10Mbps cap was in place for some users. “The consumer video experience should have been unaffected by the test,” the representative wrote, “since 1080p video is HD quality and looks great at 10 [Mbps].”

It’s true that, as we pointed out in our initial article, many users would not be able to perceive a 10Mbps limit on video speeds. Still, those clarifications seem consistent with an across-the-board throttle on video applications, put in place without any disclosure to customers. If that’s what Verizon means by optimization, then it looks an awful lot like the throttling scenarios net neutrality advocates have been warning about for years.

It’s worth remembering that Title II is still officially the law of the land. Although the FCC is doing its best to roll the rules back, Verizon Wireless remains legally a common carrier regulated under Title II, which means it’s obligated to treat all traffic equally. There are some exceptions for network management, but throttling a specific service is a textbook violation of those rules. Netflix traffic was clearly, tangibly being treated differently from other traffic, and customers hadn’t opted into any special service like Go90 that might justify it.

We asked Verizon whether it believes these tests are in violation of Title II. We’ll update with any response.

Update July 21st, 2:40PM ET: Updated with subsequent statement from Verizon. Because the statement indicated the limits were applied to all video applications, we’ve updated the headline from “throttling Netflix” to “throttling video.”

The Verge http://ift.tt/1jLudMg

Scientists are now using Wi-Fi to read human emotions


Scientists at MIT are using Wi-Fi and AI to determine your emotional state. They’ve created an algorithm that can detect and measure individual heartbeats by bouncing RF signals off of people. An RF emitter coupled with the algorithm works in the same way as an electrocardiogram (EKG/ECG), without requiring any leads to be attached to a person. This is accomplished using the same technology that we currently have in our home routers. The remarkable part is the machine learning that goes into what the scientists are calling EQ Radio. The information the AI receives has to be processed differently than a standard…
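EQ Radio’s actual signal-processing pipeline isn’t detailed here, but the core step it performs — recovering a pulse rate from a periodic reflection signal — can be sketched in a few lines. Everything below is a hypothetical illustration under simplified assumptions (a clean simulated waveform standing in for chest motion, and a naive local-maximum peak detector), not MIT’s algorithm:

```python
import math

def detect_heartbeats(signal, threshold=0.5):
    """Toy peak detector: indices of local maxima above a threshold."""
    peaks = []
    for i in range(1, len(signal) - 1):
        if signal[i] > threshold and signal[i] > signal[i - 1] and signal[i] >= signal[i + 1]:
            peaks.append(i)
    return peaks

def bpm_from_peaks(peaks, fs):
    """Average beats per minute from peak indices at sample rate fs."""
    if len(peaks) < 2:
        return None
    intervals = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

# Simulated "reflected RF" waveform: a 1.2 Hz (72 BPM) periodic component,
# standing in for the heartbeat-driven motion an RF system would observe.
fs = 100  # samples per second
signal = [math.sin(2 * math.pi * 1.2 * (i / fs)) for i in range(10 * fs)]

peaks = detect_heartbeats(signal)
print(round(bpm_from_peaks(peaks, fs)))  # → 72
```

A real system would first have to isolate the tiny heartbeat component from much larger breathing motion and noise, which is where the hard signal processing (and the machine learning on top of it) comes in.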

This story continues at The Next Web

The Next Web https://thenextweb.com

Facebook gives super-fans a home with new groups feature


Facebook is rolling out a new feature called “Groups for Pages,” which allows people who run Facebook Pages to create sub-groups within them. These groups will let dedicated users chat to one another, and to the owners of the page directly. They’ll provide a space for discussion of niche or specialized topics, with Chris Cox, chief product officer at Facebook, comparing them to fan clubs.

“If you are an artist, a business, a brand, or a newspaper, you can now create fan clubs and groups centered around your super-fans,” said Cox in a Facebook post.

Cox says the feature came about when The Washington Post’s digital and comment editors, Terri Rupar and Teddy Amenabar, started a group called PostThis – from The Washington Post. The group allowed reporters to speak directly to passionate readers of the paper and explain how stories came together. Cox compares this to a “digital version of letters to the editor, but with ongoing real-time discussions.”

The new feature also means that brands can create their own groups without having to use employees’ personal accounts, which helps protect their privacy.

In a separate post, Facebook founder Mark Zuckerberg said he hoped the new feature would help more people join meaningful communities. He highlighted one page, AddictionUnscripted.com, which has a separate support group for people affected by addiction. It already has a membership of 45,000 people.

The AddictionUnscripted.com Facebook page

Photo: Facebook

“In today’s world, we all get support from a few sources: our family and friends, our communities, and our social safety net,” wrote Zuckerberg. “In our civic discussion we most often focus on our social safety net, but I’ve found that our communities are often just as important for taking care of us, and we need to focus just as much time on building them.”

Facebook says there are currently 70 million pages that represent artists, businesses, brands, and newspapers across the social networking site. That’s a lot of potential groups — and a lot of discussion to be had.

The Verge http://ift.tt/1jLudMg

Facebook’s built-in camera does GIFs now


Facebook has been building out the camera in its main app for a while now, and today is adding a new GIF function. As first spotted by The Next Web, you can access the feature by tapping the camera icon in the top left of the app. Just swipe right to start making quick GIFs.

The function is a bit of a mixed bag. It works well enough, and you can add a bunch of different frames and filters (including some cool Prisma-esque style transfer effects). But you can only share the resulting GIFs on your Facebook story or as posts on your Facebook page. You can’t send them to other services, and you can only save them as videos. Which means they’re not much use outside of Facebook.

A screenshot of Facebook’s camera showing the GIF option at the top.

You can see how the GIF option looks in the Facebook camera above. (Also, for fans of meta: this is a screenshot of the Facebook app in which I’m taking a photo of a video of a GIF on my computer which I originally took on the Facebook app. Too many screens in our lives, folks, too many screens.) Anyway, it seems Facebook is rolling this out slowly, as not all Verge staff could access the new feature. Have a look for yourself, and maybe GIF your cat if you feel like it.

The Verge http://ift.tt/1jLudMg

Ray Kurzweil: A Strategy for Keeping the World Safe From AI


Ray Kurzweil is an inventor, thinker, and futurist famous for forecasting the pace of technology and predicting the world of tomorrow. In this video, Kurzweil discusses AI ethics and risk. Films and books have long described a rogue superintelligent AI bent on our destruction, he says.

But it won’t be a pop culture future. Even today, there aren’t one or two AIs in the world, there are billions in our pockets in continuous communication with the cloud. Human civilization and AI are already “very deeply interwoven” and the connection will only increase, according to Kurzweil.

There is risk, and it’s good we’re beginning to do something about it. But future AI will grow from and be an integral part of society. Which means the kind of AI we develop will be very much up to us.

Image Credit: NASA

Singularity Hub http://ift.tt/1EMh4iX

Ads are coming to Facebook Messenger’s home screen


Facebook must really like the revenue that it’s getting from ads on the Facebook Messenger home screen, because after a limited beta in Thailand and Australia, the company has decided to roll out the advertising option worldwide. According to VentureBeat, you’ll likely see the flow of your home screen broken up by sponsored ads before the end of 2017. And just like the ads you see on Facebook itself and Instagram, they’ll be targeted and specific to you.

Aside from the home screen placement, businesses on Messenger can already send sponsored messages to users that have messaged their business previously. “[Advertising is] not necessarily everything, but it’s definitely how we’re going to be making money right now,” Messenger head of product Stan Chudnovsky told VentureBeat. “There are some other business models we are exploring as well, but they’re all around ads one way or another.” Lovely. I’m sure the brands are salivating. Back in May, Facebook revamped the Messenger home screen to again focus on chat and communication; other features like Games were shifted to different tabs.

Chudnovsky said that Facebook will “start slow” with the home screen ads in Messenger. “When the average user can be sure to see them we truly don’t know because we’re just going to be very data-driven and user feedback-driven on making that decision.” So maybe if enough people totally hate them, Facebook will stick to its other Messenger ad options.

And if not, well, remember that you can install Messenger Lite on Android to hopefully avoid a lot of this.

The Verge http://ift.tt/1jLudMg