Monday, January 25, 2016

Do You Really Need a $10,000 Toilet?

1 – Toto Neorest 750H
Toto Neorest 750H
You may not “need” a $10,000 toilet, but once you’ve experienced a Toto Neorest 750H, you will aspire to own one. I spent the week at the Kitchen & Bath Industry Show (KBIS), which is presented in conjunction with the National Kitchen and Bath Association (NKBA), and it was awesome! The $31 billion kitchen and bath industry brings together the world’s best designers, craftspeople and technologists in ways that truly impact our daily lives – up to and including high-tech toilets that keep you warm and clean and practically clean themselves. Here are six things that got my undivided attention.

Shelly Palmer and the Neorest 750H

Yes, that’s me staring into a $10,000 toilet. All kidding aside, the engineers at Toto have taken toilet technology to a whole new level. The Neorest 750H has several remarkable features. There’s an integrated UV light in the lid that combines with a special glaze to help break down organic material. It automatically rinses the bowl with antibacterial electrolyzed water, and the result is an almost self-cleaning toilet. Then, there are the fun features like the heated seat, deodorizer and warm water sprays – which require a detailed explanation that I will not provide here.

2 – 3D-Printed Faucets from DXV by American Standard

I’m a big fan of 3D printing (additive manufacturing) and the practically infinite manufacturing opportunities it empowers. You can 3D print in almost any material. Designer Jean-Jacques L’Henaff from LIXIL Water Technology has teamed up with American Standard to demonstrate the power and potential of the art form. Today, this is an objet d’art; its only advantage is the 3D-centric design. But soon, this technology will be pervasive, and it is likely to have a dramatic impact on the way we shop for (and manufacture) kitchen and bath hardware. While these 3D-printed faucets cost approximately $19,000 in stainless steel, for $50 more you can print them in titanium. Who wouldn’t want a 3D-printed titanium faucet for $19,050?

3 – Kalamazoo Hybrid Fire Grill
Kalamazoo Hybrid Grill

If you are a true grill master or just a weekend family griller, you will lose your mind when you see the Kalamazoo Hybrid Fire Grill in action. It allows you to use any combination of gas, wood and charcoal to create the perfect fire at the perfect temperature for anything you might want to grill. The wood/charcoal drawer makes refueling easy, and the grill is engineered to be very easy to clean. I am a particular fan of the different types of grates (meat, fish, veggies) that are available. I was hugely impressed by the entire line of Kalamazoo Outdoor Gourmet’s products, but the Hybrid Fire Grill just made me smile.

4 – Monogram Indoor Pizza Oven
Monogram Indoor Pizza Oven
When you grow up in New York, pizza is a staple food. Over the years, gourmet pizza recipes have come into their own, and nowadays you can put anything on a pizza crust and call it a pizza. Purists and adventurers alike will instantly fall in love with the Monogram Indoor Pizza Oven. This wall-mounted indoor oven takes only 25 minutes to reach 725 degrees Fahrenheit (the correct temperature for gourmet pizza). Then, you’re two minutes away from all the amazing pizza combinations you can concoct. You may be put off by the $9,900 price tag – but great pizza is worth it!

5 – MoistureShield® Composite Decking

I don’t think I was prepared for the state of the art in composite decking material, and I was beyond impressed by what some of the world’s best designers were capable of doing with it. In practice, you can use composite decking materials just as you would use wood. MoistureShield is a green product using 95 percent recycled material, and the company’s non-capped products look and feel better than almost everything else I’ve seen. While MoistureShield has competition from Trex®, TimberTech®, EverGrain® and Fiberon®, MoistureShield’s price, performance and warranty make it stand out.

6 – Boral Versetta Stone®

Cultured stone veneers have been around for a long time. Most products require extremely talented professionals to install them, and as far as aesthetics are concerned, only you can judge – it’s a matter of personal taste. What caught my attention about Boral Versetta Stone were the unique form factor, the mounting system and how incredibly easy it is to install. It’s packaged in 17-lb. sheets that can be installed with mechanical fasteners. Basically, anyone with a level, a hammer and a sense of what a stud is can do this by themselves. If you have a project (indoor or outdoor) that needs this kind of “stone mason” look and feel, Boral Versetta Stone is worth considering.

Friday, January 22, 2016

The Social Security Administration (SSA) is looking for two executives

The Social Security Administration (SSA) is looking for two executives to lead its Office of Income Security Programs (OISP) as the Associate Commissioner (AC) and Deputy Associate Commissioner (DAC) responsible for Retirement and Survivors Insurance (RSI) and Supplemental Security Income (SSI) policy.  The AC and DAC provide executive direction to an organization with approximately 125 employees and are responsible for ensuring that the programs under their purview are administered in an effective and efficient manner and align with the SSA mission and strategic objectives.  The AC will report directly to the Deputy Commissioner and Assistant Deputy Commissioner for Retirement and Disability Policy, and the DAC will report directly to the AC; both positions are located at the agency’s Headquarters campus in Woodlawn, Maryland.

The positions carry responsibility for a wide range of Social Security and SSI policies regarding eligibility and benefit amounts, such as earnings and coverage requirements, defining marriage and other family relationships, SSA’s role in Medicare, and implementing provisions of the recent Bipartisan Budget Act. Specific topics include: Social Security numbers, applications and appeals, due process, and administrative finality; wages, coverage and exceptions, self-employment income, retirement earnings test, State and local coverage, windfall elimination provisions and government pension offset; SSI income, resources, and in-kind support and maintenance; preventing/reducing improper payments; representative payees; and Agency notice improvement activities.

Applicants should have a broad knowledge of Social Security laws, regulations, and policies; senior-level experience developing, analyzing and/or interpreting public policy for social insurance and income maintenance programs in the public or private sector; and demonstrated senior-level experience planning, directing, managing and overseeing a large, fast-paced organization.  Responsibilities also include human resources management, collaboration with multiple internal and external customers, tracking congressional legislation, assessing the political and operational impact of decisions with a high degree of quality, and providing first-class service to the public being served.

The complete vacancy announcement can be found on USAJOBS; direct link below.  Interested applicants may apply to either or both of the vacancies:

Associate Commissioner                   SSA-EX-463  
Deputy Associate Commissioner       SSA-EX-464  

Applicants should address the leadership and technical requirements of the position; describe their work accomplishments and how they have exercised leadership to deliver significant results; and describe their experience collaborating, communicating and cooperating to achieve goals, as well as leading strategic change and overcoming obstacles to effectuate those changes.

J. Jioni Palmer
Associate Commissioner for External Affairs
(T) 410-965-1804

Thursday, January 21, 2016

Cisco's 2016 Security Report: Attacks getting stronger, defender confidence dropping

Cisco has released the 2016 edition of its Annual Security Report on cybersecurity trends and challenges. Here are the highlights.

On Tuesday, January 19, Cisco released its 2016 Annual Security Report, highlighting the progression of cybersecurity and what businesses can expect as they move into the new year. The full report (with appendix) is 87 pages long, so we'll give you the highlights.

One of the top findings from this year's report was that defender confidence is dropping, with only 45% of organizations worldwide confident in their security relative to today's threats. However, many executives said they expect greater transparency on security in the future. According to a company press release: "This points to security as a growing boardroom concern."

Still, these growing concerns are acting as a catalyst for organizations to improve their security practices, as they know where their weaknesses are and what they need to work on.

SEE: Hackers' modus operandi: 5 insights that may help identify emerging threats

Aging infrastructure also played a role in poor security posture, with 92% of internet devices operating with known vulnerabilities. Jason Brvenik, principal engineer for the Security Business Group at Cisco, said that some were running with up to 26 vulnerabilities. Additionally, 31% of devices are running with no vendor support.

"The second highest barrier to adopting advanced security practices and technologies are compatibility issues," Brvenik said.

Cisco's report also identified another, relational threat to enterprises—SMBs. Based on the report's findings, SMBs use fewer tools to identify and defend against security threats. These "structural weaknesses" present a potential risk to enterprises that may be working with SMBs in some capacity.

However, SMBs are improving their security due, in part, to outsourcing security services. All in all, outsourcing security is on the rise across the board with more than half of all larger organizations outsourcing consulting services, as well as a good number of businesses outsourcing auditing, monitoring, incident response, and more.

All these changes in the way businesses handle security raise the question of where the major threats are happening now and where they'll come from in the future.

For one, social media platforms are growing as a foundation for criminals to carry out their campaigns—especially when it comes to compromised servers, like those for WordPress. Between February and October 2015, the number of WordPress domains that were being used by cybercriminals grew by a staggering 221%, according to the report.

Another growing risk is the browser—specifically, malicious browser extensions, which have exposed more than 85% of organizations to data leakage. Craig Williams, senior technical leader at Cisco, said that he wasn't surprised by this number, though.

"The fact of the matter is, these days, if you're not patching your browser and if you're not patching the plugins, you're going to be attacked by a massive number of threats," Williams said.

However, Williams said, there are legitimate reasons to not patch a browser, such as applications that require certain library versions. But, in this day and age, there are so many options for web browsers that there isn't really an excuse.

Being that most cybersecurity issues involve the internet, of course Cisco had to take a look at DNS risks. Of "known bad" malware, the Cisco report found that almost 92% used DNS to carry out their campaigns. While he wasn't totally surprised by the number, Williams said he would have originally guessed it to be closer to 85%.

HTTPS encrypted traffic is growing and, based on what it observed in 2015, Cisco believes it will soon become the leading form of traffic online. And, while that may seem like a good thing on the surface, it could introduce other problems.

"Although encryption can help protect consumers, it also can undermine the effectiveness of security products, making it more difficult for the security community to track threats," the report said. "Adding to the challenge, some malware may initiate encrypted communications across a diverse set of ports."

The three big takeaways

  • Attacks are increasing and organizations are losing confidence in their ability to stop them, which could serve as a catalyst for greater investments in security and greater demand for third-party and cloud security services.
  • SMBs have particular risks, and so do the larger organizations that partner with them.
  • The rise of HTTPS to secure web traffic offers new protections, but it's not a silver bullet and could be co-opted by attackers to better cover their tracks.

Tuesday, January 19, 2016

The Unreasonable Reputation of Neural Networks (Shelly Palmer)

January 12, 2016

It is hard not to be enamoured by deep learning nowadays, watching neural networks show off their endless accumulation of new tricks. There are, as I see it, at least two good reasons to be impressed:
(1) Neural networks can learn to model many natural functions well, from weak priors.
The idea of marrying hierarchical, distributed representations with fast, GPU-optimised gradient calculations has turned out to be very powerful indeed. The early days of neural networks saw problems with local optima, but the ability to train deeper networks has solved this and allowed backpropagation to shine through. After baking in a small amount of domain knowledge through simple architectural decisions, deep learning practitioners now find themselves with a powerful class of parameterised functions and a practical way to optimise them.
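Stripped of depth and nonlinearity, the optimisation recipe described above can be sketched in a few lines. This is a toy linear model of my own devising, not anything from the essay; it only illustrates what a "parameterised function with a practical way to optimise it" means in code:

```python
import numpy as np

# Toy illustration (my own, not from the essay): fit a parameterised
# function to data by following the gradient of a loss, the same
# optimisation principle that backpropagation applies to deep networks.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                  # 200 samples, 3 features
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=200)    # noisy linear targets

w = np.zeros(3)                                # parameters to learn
lr = 0.1
for _ in range(500):
    grad = 2 * X.T @ (X @ w - y) / len(y)      # gradient of mean squared error
    w -= lr * grad                             # one gradient-descent step

# w now approximates true_w
```

In a deep network the function is a stack of nonlinear layers and the gradient comes from backpropagation rather than a closed form, but the update rule is exactly this one.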
The first such architectural decisions were the use of either convolutions or recurrent structure, to imbue models with spatial and temporal invariances. From this alone, neural networks excelled in image classification, speech recognition, machine translation, Atari games, and many more domains. More recently, mechanisms for top-down attention over inputs have shown their worth in image and natural language tasks, while differentiable memory modules such as tapes and stacks have even enabled networks to learn the rules of simple algorithms from only input-output pairs.
(2) Neural networks can learn surprisingly useful representations
While the community still waits eagerly for unsupervised learning to bear fruit, deep supervised learning has shown an impressive aptitude for building generalisable and interpretable features. That is to say, the features learned when a neural network is trained to predict P(y|x) often turn out to be both semantically interpretable and useful for modelling some other related function P(z|x).
As just a few examples of this:
  • Units of a convolutional network trained to classify scenes often learn to detect specific objects in those scenes (such as a lighthouse), even though they were not explicitly trained to do so (Zhou et al., 2015)
  • Correlations in the bottom layers of an image classification network provide a surprisingly good signature for the artistic style of an image, allowing new images to be synthesised using the content of one and the style of another (Gatys et al., 2015)
  • A neural network taught to predict missing words from sentences learns meaningful word embeddings, where simple vector arithmetic can be used to find semantic analogies. For example:
    • v(king) - v(man) + v(woman) ≈ v(queen)
    • v(Paris) - v(France) + v(Italy) ≈ v(Rome)
    • v(Windows) - v(Microsoft) + v(Google) ≈ v(Android)
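The analogy arithmetic above can be demonstrated mechanically. The vectors below are hand-crafted toys, not embeddings from any trained model; only the offset-plus-nearest-neighbour procedure is the point:

```python
import numpy as np

# Hand-crafted 2-d "embeddings" (axis 0 ~ royalty, axis 1 ~ gender);
# a trained model would instead learn hundreds of dimensions from text.
emb = {
    "king":  np.array([0.9,  0.8]),
    "queen": np.array([0.9, -0.8]),
    "man":   np.array([0.1,  0.8]),
    "woman": np.array([0.1, -0.8]),
}

def analogy(a, b, c):
    """Return the word closest to emb[a] - emb[b] + emb[c], excluding the inputs."""
    target = emb[a] - emb[b] + emb[c]
    return min((w for w in emb if w not in (a, b, c)),
               key=lambda w: np.linalg.norm(emb[w] - target))

print(analogy("king", "man", "woman"))  # queen
```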
I have no doubt that the next few years will see neural networks turn their attention to yet more tasks, integrate themselves more deeply into industry, and continue to impress researchers with new superpowers. This is all well justified, and I have no intention to belittle the current and future impact of deep learning; however, the optimism about just what these models can achieve in terms of intelligence has been worryingly reminiscent of the 1960s.

Extrapolating from the last few years’ progress, it is enticing to believe that Deep Artificial General Intelligence is just around the corner and just a few more architectural tricks, bigger data sets and faster computing power are required to take us there. I feel that there are a couple of solid reasons to be much more skeptical.
To begin with, it is a bad idea to intuit how broadly intelligent a machine must be, or have the capacity to be, based solely on a single task. The checkers-playing machines of the 1950s amazed researchers and many considered these a huge leap towards human-level reasoning, yet we now appreciate that achieving human or superhuman performance in this game is far easier than achieving human-level general intelligence. In fact, even the best humans can easily be defeated by a search algorithm with simple heuristics. The development of such an algorithm probably does not advance the long term goals of machine intelligence, despite the exciting intelligent-seeming behaviour it gives rise to, and the same could be said of much other work in artificial intelligence such as the expert systems of the 1980s. Human or superhuman performance in one task is not necessarily a stepping-stone towards near-human performance across most tasks.
By the same token, the ability of neural networks to learn interpretable word embeddings, say, does not remotely suggest that they are the right kind of tool for a human-level understanding of the world. It is impressive and surprising that these general-purpose, statistical models can learn meaningful relations from text alone, without any richer perception of the world, but this may speak much more about the unexpected ease of the task itself than it does about the capacity of the models. Just as checkers can be won through tree-search, so too can many semantic relations be learned from text statistics. Both produce impressive, intelligent-seeming behaviour, but neither necessarily paves the way towards true machine intelligence.
I’d like to reflect on specifically what neural networks are good at, and how this relates to human intelligence. Deep learning has produced amazing discriminative models, generative models and feature extractors, but common to all of these is the use of a very large training dataset. Its place in the world is as a powerful tool for general-purpose pattern recognition, in situations where both the number of samples (n) and the dimensionality of the data (d) are large. Very possibly it is the best tool for working in this paradigm.
This is a very good fit for one particular class of problems that the brain solves: finding good representations to describe the constant and enormous flood of sensory data it receives. Before any sense can be made of the environment, the visual and auditory systems need to fold, stretch and twist this data from raw pixels and waves into a form that better captures the complex statistical regularities in the signal. Whether this is learned from scratch or handed down as a gift from evolution, the brain solves this problem adeptly - and there is even recent evidence that the representations it finds are not too dissimilar from those discovered by a neural network. I contend that deep learning may well provide a fantastic starting point for many problems in perception.
That said, this high n, high d paradigm is a very particular one, and is not the right environment to describe a great deal of intelligent behaviour. The many facets of human thought include planning towards novel goals, inferring others' goals from their actions, learning structured theories to describe the rules of the world, inventing experiments to test those theories, and learning to recognise new object kinds from just one example. Very often they involve principled inference under uncertainty from few observations. For all the accomplishments of neural networks, it must be said that they have only ever proven their worth at tasks fundamentally different from those above. If they have succeeded in anything superficially similar, it has been because they saw many hundreds of times more examples than any human ever needed to.
Deep learning has brought us one branch higher up the tree towards machine intelligence and a wealth of different fruit is now hanging within our grasp. While the ability to learn good features in high dimensional spaces from weak priors with lots of data is both new and exciting, we should not fall into the trap of thinking that most of the problems an intelligent agent faces can be solved in this way. Gradient descent in neural networks may well play a big part in helping to build the components of thinking machines, but it is not, itself, the stuff of thought.

Thursday, January 14, 2016

IN MY OPINION: Fear loses the battle to love and trust

By: Ricardo Tribín Acosta

Fear, that dreaded and unpleasant feeling, sometimes paralyzes us and paints life in a tragic, horrible, frightening light. It is the poisoned elixir that in many cases we drink without even realizing why. It is present even when we ignore it, and what is serious is that in most cases it is merely a product of our own imagination, and therefore a creation of our own making.

A man set out to drive in a nail to hang something up, and, without knowing why, he was afraid of hammers because of what the blow might do to his finger. He thought it over again and again until he finally made up his mind, and when he swung the first blow, wham, it landed not on the nail but on his finger. "Ouch!" the man cried. "I knew this would be too much for me." "Well, of course. Don't you see you had been programming it that way all along?" a friend told him.

Fear is there even if we ignore it, and the only way to push it aside lies in changing the way we think and in seeing things and situations as they really are. So when we manage to regain confidence in ourselves and in the action we are about to take, and add to that the component of love, fear, confronted by reality and mental balance, will turn tail and run.

Miami, January 14, 2016

Copyright © Ricardo Tribín, All rights reserved.


Tuesday, January 12, 2016

Florida Chamber: Capitol Days to Feature Putnam, Jennings, Thrasher, Weatherford

Adam Putnam, Toni Jennings, John Thrasher, Will Weatherford
to Launch 100-Year Celebration at
Florida Chamber Board of Governors Capitol Days

The Florida Chamber has been championing Florida's economy for nearly 100 years -- fighting for a better future since 1916.

Please join us as we honor our past and look toward securing an even stronger future for Floridians during our Board of Governors Capitol Days and Annual Meeting, held January 13-15 in Tallahassee.

We will launch our 100-year celebration on Wednesday, January 13, at the 100 Years of Securing Florida's Future: Legacy of Leadership Dinner.

Speakers include:
  • Past Florida Chamber of Commerce Chair, Former Florida Lieutenant Governor and Former Florida Senate President Toni Jennings
  • Florida State University President, Former Speaker of the Florida House and Florida Senator John Thrasher
  • Florida Chamber of Commerce Board Member, and Former Speaker of the Florida House Will Weatherford
  • Florida Agriculture Commissioner Adam Putnam
  • Emcee: Ron Sachs, President and CEO, SachsMedia Group 
Florida Chamber Board of Governors Capitol Days
January 13-15, 2016
FSU Turnbull Conference Center, Tallahassee, FL

Contact Sarah Spagnola at 850-521-1292 or for sponsorship opportunities and more information.

Monday, January 11, 2016

How Valeant Tripled Prices, Doubled Sales of Flatlining Drug (BusinessWeek)

  • Wellbutrin antidepressant gets new life with free co-pays
  • List price increased 11 times since 2014 to $17,000 a year

No wonder investors once loved Michael Pearson. In short order, he managed to double sales of Wellbutrin XL, the popular antidepressant, even though far cheaper generics were out there.

But behind that move is an untold story that illustrates how Pearson and his controversial company, Valeant Pharmaceuticals International Inc., vaulted to seemingly unimaginable heights before falling back to Earth.

On Wednesday Valeant named an interim chief executive officer to replace Pearson, who is on medical leave with severe pneumonia. But the Wellbutrin XL sales strategy, like those of other Valeant drugs, is drawn from the playbook that Pearson followed while building Valeant into one of the hottest pharmaceutical companies around. Now it may raise new questions for investors about how the company can sustain growth.

Wellbutrin XL was transformed into a top seller in part via a relationship with a specialty pharmacy -- an arrangement similar in some ways to the one that plunged Valeant into crisis when it first came to light last year. While the relationship may account for only a portion of Wellbutrin XL sales, it nonetheless sheds light on how Valeant has landed at the center of a drug-marketing flap by often picking up the cost of co-pays for consumers and then steadily increasing the prices to insurers.

Even as the number of prescriptions filled for the drug has declined, dollar sales of Wellbutrin XL, Valeant’s third-best seller in the third quarter, have kept climbing as Valeant raised the price 11 times over the past two years. Wellbutrin XL, a once-daily version of a 30-year-old antidepressant, costs $1,400 a month, compared with a $30 generic version. Sales were on track to exceed $300 million in 2015, as of the third quarter, up from around $150 million in 2013.

Wellbutrin XL price increases, moreover, appear to be part of a broader Valeant strategy. Of more than 250 brand-name drugs surveyed at several companies, prices for only 10 Valeant products rose at double-digit rates in each of the first three quarters of 2015, according to a Bloomberg Intelligence analysis. For Wellbutrin XL, the average quarterly increase was 21 percent. The retail price for a year’s supply of the 300-milligram dosage is now about $17,000.

Securing Reimbursement

One way Valeant has been increasing sales involves the specialty pharmacy, Direct Success Inc. It’s based in Farmingdale, New Jersey, not far from the one in Pennsylvania, Philidor RX Services LLC, that sparked questions about Valeant and left Pearson struggling to contain the damage.

Specialty pharmacies are nothing like your local drugstore. They have been around for years to handle expensive, specialized drugs for patients with complex diseases such as cancer or multiple sclerosis. But some, like Philidor, have recently become controversial for the lengths they go to fill prescriptions with brand-name drugs and then secure insurance reimbursement.

Health plans and other critics say some specialty pharmacies have evolved into little more than marketing extensions for Valeant and other drug companies to push ordinary brand-name drugs where cheaper generics may be appropriate. And it’s unusual in the drug industry for an old brand-name drug with generic competition to enjoy the sort of sales burst that Wellbutrin XL has had.

There’s nothing illegal about selling drugs through a mail-order pharmacy. And some doctors prefer to prescribe branded drugs, including Wellbutrin XL, over generics because they believe they may be more effective or have fewer side effects.

‘No Hassles’

Direct Success manages Valeant’s Wellbutrin XL Guarantee Program. It offers the drug to consumers for low co-pays, or for free, while assuring doctors that their prescriptions will be honored with “no hassles.”

“Wellbutrin XL is sold through many channels, including the commercial channel, Medicare, Medicaid and the Department of Defense,” said Laurie Little, a spokeswoman for Valeant. Sales of Wellbutrin XL through a marketing program involving Direct Success “accounted for less than 5 percent of Wellbutrin XL sales,” she said.

In a statement posted online Friday, Valeant said that Direct Success has “no incentive to maximize reimbursement” for the drug. Direct Success receives a discount on the product and a fee when a prescription is filled, “regardless of whether they get any money from insurance.”

Cheryl McDaniel, the founder of Direct Success, said the private company owned by her and her husband operates independently from Valeant. She declined to comment on what other drugs the company may sell.

“We are committed to the highest ethical standards,” said McDaniel. “Any broad comparisons between Direct Success and Philidor would be inaccurate.”

Unlike Philidor, Direct Success has handled mail-order marketing for companies other than Valeant. In response to concerns that Philidor’s prescription-filling practices were inappropriate, Valeant has severed ties with the pharmacy, which is closing.

‘Tip of the Iceberg’

Valeant’s marketing relationship with Direct Success "raises the question of whether Philidor was just the tip of the iceberg," said Mark Merritt, president and CEO of the Pharmaceutical Care Management Association. Such relationships tend “to get patients to buy a much more expensive product they don’t need,” said Merritt, whose group represents pharmacy benefit managers, the companies that operate drug plans for insurers and employers.

Web surfers who land on the Wellbutrin XL site, or are directed there by doctors, can click on a guarantee program that promises that patients with generous commercial insurance will “pay $0” for "unlimited use." Consumers whose insurance pays less than $100 are charged $50 a month.

An enrollment page for doctors enables them to approve Wellbutrin XL prescriptions without generic substitutes simply by signing and initialing a one-page form. “No hassles and no need for call-backs -- guaranteed. Your prescription decision is never questioned by the pharmacy,” the website says.

‘Under the Radar’

Richard Evans of SSR Health, an investment researcher in Montclair, New Jersey, said that Valeant has been able to pass through price increases for old drugs such as Wellbutrin XL because they have not drawn the attention of insurers, who have tended to focus their cost-cutting efforts on the bigger-selling drugs. 

"They were under the radar and really had not drawn payers’ attention," he said. Now that Valeant has attracted so much scrutiny, insurers are likely to start refusing to pay a premium for drugs like Wellbutrin XL, he said.

Aetna Inc. continues to cover Wellbutrin XL in most of its plans, although it encourages patients to use generics, a company spokeswoman said. Express Scripts Holding Co. said most of its plans cover the branded drug only if a physician vouches that cheaper generics have not worked. CVS Health Corp. said that Wellbutrin XL is subject to its highest co-pay level in its main plan and is not covered at all in some plans.

Even if some insurers refuse to reimburse for drugs like Wellbutrin XL, it can be profitable for drug companies to pick up the tab for co-pays, often via coupons offered to patients. If enough insurers pay to cover such drugs, the manufacturers more than make up for those that don’t. Manufacturers can earn a 4-to-1 to 6-to-1 return on investment on co-pay coupon programs, according to the Pharmaceutical Care Management Association. SSR Health estimates Valeant last year received about half the list price for Wellbutrin XL.
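To see how covering co-pays can still pay off, here is a back-of-the-envelope sketch. Apart from the $1,400 monthly list price quoted earlier, every number is a hypothetical assumption of mine, not a figure from the article:

```python
# Hypothetical co-pay coupon economics: the manufacturer absorbs the
# patient's co-pay on every fill, and reimbursements from the insurers
# that do cover the drug carry the program.
insurer_pay = 700.0    # assumed net reimbursement on a covered claim (half the $1,400 list)
covered_share = 0.5    # assumed fraction of claims that insurers cover
copay_cost = 50.0      # assumed co-pay the manufacturer absorbs per fill

expected_revenue = covered_share * insurer_pay       # per prescription
roi = (expected_revenue - copay_cost) / copay_cost   # return per dollar of co-pay spend
print(roi)  # 6.0, at the top of the 4-to-1 to 6-to-1 range the PCMA cites
```

With these assumed numbers, each dollar spent covering co-pays brings back six, which is why a drug maker can profit even when many insurers refuse to pay.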

Growth Spurt

The rebound in Wellbutrin XL sales appears to roughly coincide with a growth spurt at Direct Success. McDaniel, a 58-year-old graduate of Ohio’s Miami University with a background in direct marketing, founded the company two decades ago in her home. Her husband David, the co-owner, is a former business editor at the Asbury Park Press. In July 2014, her husband’s former paper re-published a press release proclaiming "an explosion in growth" that more than doubled the size of Direct Success over two years.

Last June, Direct Success opened a second operations center in central Ohio, which raised the pharmacy’s head count to roughly 140. The opening of the Ohio facility also coincided with the beginning of Wellbutrin XL’s third-quarter sales spurt.

"Drugs that have recently come off patent (LOE) have found a second life by using our pharmacy," according to the Direct Success website, referring to the loss of exclusivity a branded drug faces when generics enter the market.

Friday, January 8, 2016

The International Banking System Faces an Existential Threat (Forbes)

By Mark Fleming-Williams

Christmas did not offer much good cheer to the world’s bankers, who have received a sustained kicking since the financial crisis erupted in 2008. In the latest blow, Switzerland announced that it would hold a referendum on a radical proposal that would strip commercial banks of the ability to create money, depriving them of a great deal of their profit-making capabilities. If the Swiss proposal catches on around the world, it could shred core business assumptions that have underpinned the banking model over the past three centuries.

From Babylon to Central Bank

The earliest banks we know of, in ancient Babylon, were temples that doubled as repositories where one could store wealth. At some point, the guardians of the stored treasure realized they could put this accumulated wealth to work, and banks accordingly began to lend capital. Borrowers would pay interest on what they borrowed, and this interest would ultimately find its way back to the lenders after the banks had taken a cut. The banks became trusted intermediaries that brought lender and borrower together and ensured neither would be cheated. Paper money emerged after people found it was easier to buy things using deposit slips from their bank than to carry gold around.

The next evolution happened when bankers realized that since depositors almost never simultaneously withdrew all their funds, banks could lend more capital than had been deposited. This allowed banks to “create” money in the sense that bankers could issue loans not necessarily backed up by hard deposits. Creating revenue in this way proved lucrative, but it brought banks into conflict with rulers, who were notionally in charge of the state’s money supply and any gains to be made from it. In England, whose financial system is in many ways the progenitor of today’s global system, this battle was played out between banker and ruler in the 16th and 17th centuries.

Ultimately, in 1666 King Charles II — well aware of the limits of his own power thanks to the beheading of his father 17 years earlier — put control of the money supply into private hands. The privatization of the money creation process gave birth to the system we use today, in which private or commercial bank loans are responsible for 97 percent of the money circulating in the modern global economic system. In another change, 28 years after Charles II’s reform, an enterprising group of businessmen offered the government cheaper loans in exchange for certain privileges, such as a monopoly over the printing of physical currency, and so the Bank of England was born.

The benefits of the new system proved immediately apparent. Interest rates on government borrowing dropped from 10-14 percent in the 1690s to 5-6 percent in the early 1700s. This gave Britain a great deal of leeway when it came to military spending, which it soon put to use. But the privatization of money creation also came with drawbacks, namely the economic cycle of boom and bust. Leaving the money-lending and -creating decisions up to banks resulted in a system of extremes where bankers created speculative bubbles via vast quantities of loans and money when times were good, only to refuse to lend — in a sense destroying money — once an ensuing speculative bubble burst.

This led to liquidity crises, with the South Sea Bubble of 1720 providing early evidence of this mechanism kicking into action. The fact that banks were lending more money than they could back up with capital also left them exposed to bank runs whenever the public lost confidence in them. The reserve ratio, which requires banks to keep a fraction of their loans backed by safer assets such as government debt or central bank money, is an attempt to keep this threat at bay. But it is an inherent characteristic of so-called fractional reserve banking that the risk of bank runs is ultimately inescapable.
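The reserve-ratio mechanics can be sketched numerically. This is an illustrative model only; the 10 percent ratio, the 100-unit initial deposit, and the `money_created` helper are assumptions made for the example, not anything from the article:

```python
# Illustrative sketch of fractional-reserve money creation.
# Assumed parameters: a 10% reserve ratio and a 100-unit initial deposit.

def money_created(initial_deposit: float, reserve_ratio: float, rounds: int) -> float:
    """Total deposits after repeatedly lending out the non-reserved fraction
    of each deposit and having every loan re-deposited in the banking system."""
    total = 0.0
    deposit = initial_deposit
    for _ in range(rounds):
        total += deposit
        deposit *= (1 - reserve_ratio)  # banks keep the reserve, lend the rest
    return total

# With a 10% reserve ratio, 100 units of deposits expand toward
# initial_deposit / reserve_ratio = 1000 units of money.
print(round(money_created(100.0, 0.10, rounds=200), 2))

# With 100% reserves, the reform the article later describes, the loop
# never creates anything beyond the original deposit.
print(money_created(100.0, 1.00, rounds=200))  # 100.0
```

The contrast between the two printed figures is the whole debate in miniature: fractional reserves multiply the money supply (and the attendant risk of runs), while full backing leaves the bank a pure conduit.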

Britain, and indeed all the other countries that came to adopt the system, grew accustomed to a regular waxing and waning of the money supply and to the consequent up-and-down economy. There were ways to palliate this cycle, with the Bank of England slowly developing into the stabilizing force it is today. In times of crisis, the Bank of England would lower interest rates and flood the market with liquidity, bailing out any solvent but illiquid banks to keep the system functioning, thus smoothing the money supply’s wilder fluctuations.

As British and then American influence spread, so did banks’ power, and capital flowed ever more freely around the world as domestic deposits were used to finance international projects. The system was heading for a fall, however, when World War I created great economic imbalances between Europe and the United States. In the 1920s, the Federal Reserve attempted to restore prewar parity by keeping interest rates artificially low, but this led to abundant speculative U.S. capital flooding across the Atlantic, particularly into Germany. The ensuing giant bubble finally popped in 1929, leading to the dramatic liquidity shortages of the Great Depression and creating the circumstances that culminated in World War II. The experience led to the partial reining in of banks, with the Glass-Steagall legislation in the United States in the early 1930s limiting their ability to take part in speculative investments.

Time has a way of chipping away at such precautions, however, and the banks gradually escaped their shackles and capital came to flow freely around the world once again. More countries became accustomed to the ebb and flow of bubble and crisis, though these crises tended to be more regional in scope (e.g., Latin America, Asia, Scandinavia). When global crisis finally struck again in 2008, it was different from 1929 in that there was no world war to blame for the global economic imbalances. Instead, it was a giant version of the regular crises inherent in the system, following an extended period in which the banks had had things pretty much their own way. This led to the thinking that it is the banks, and indeed the system they created around themselves, that need changing. In the eight years since 2008, layer upon layer of 1933-style regulation and restriction have thus been heaped on the banking sector.

A Radical Reform

It is into this atmosphere that the idea of stripping banks of their money-creating abilities has gained currency (regained, in fact, since calls for it date back at least to the 1930s). According to its proponents, the way to root out the instability inherent to the system is to require banks to back their loans 100 percent with reserves. This essentially would be a step back to the point where banks would again function as conduits rather than creators of capital. Under the reformed system the creation of new money would instead be the prerogative of the central bank and the government. These national institutions would in theory be motivated by the needs of the state rather than by short-term profit and would keep the money supply growing at a fixed rate, doing away with the wild fluctuations of the credit cycle. (One challenge to overcome would be politicians attempting to hijack the money supply for short-term political gain.) Proponents of such a system point to many expected benefits: bank runs would be eliminated, the proceeds of money creation would go to the government and thus the taxpayer rather than to the banking elite, government debt would be a thing of the past, and private debt would be greatly reduced. (Indeed, the predominance of debt in today’s world is partly a product of it being required in the money creation process.)

But there also would be great risks involved, the main one being the fear of the evil unknown. Though the economic instabilities of the past 300 years appear to have resulted largely from the fractional reserve system, was it also responsible for the relatively breakneck growth over the same period? Moreover, the changeover from one system to the other would be extremely tricky, requiring vast quantities of central bank money-printing and debt buybacks. That would be a recipe for an extremely fraught period carrying immense risks of mismanagement. In truth, another full-blown financial crisis may have to take place before such a changeover could be made at the global level.

But the theoretical upsides are great, as are frustrations with the current system, and the idea has begun to gather momentum. In 2012, the International Monetary Fund published an influential research paper laying out the case for the proposed system, and in 2015 the Icelandic government commissioned a report on the prospect of undertaking the changes. In Switzerland, a law requiring a referendum to take place should 100,000 signatures be gathered has set the country on a course to possibly being first to undertake the great experiment. Strikingly, the revolution is being considered at both ends of the spectrum: Iceland has lately proved among the most financially adventurous players on the global economic scene, while Switzerland has long been one of the most conservative. Considering the risks involved, adoption in a smaller economy such as Iceland or Switzerland would be a useful test case from a global perspective. It would limit the cost of failure to the global economy while helping establish the best way of adopting the changes should the reforms actually work.

For banks, the prospect is of course nothing less than a nightmare scenario, especially coming on top of all of their existing woes. These have included not only increased regulation but also the threat from a disruptive new technology undercutting their basic model: Bitcoin, the electronic currency that emerged almost exactly as the financial crisis struck. While Bitcoin has suffered its own wild fluctuations in the eight years since its birth, the technology that underpins it, the blockchain, has truly historic potential. Its architects appear to have created an electronic system in which both parties in a transaction can act with confidence without the need for an intermediary, though there is some added risk for the payer, since reversing transactions is more difficult than in traditional banking. The world's banks therefore face both the prospect of losing their money-creation privileges and a potential usurper threatening their long-established role as the middleman through which all capital must flow. As 2015 fades into 2016, it is hard to think of a time in the past 300 years when the banker's position in society has been more at risk.

Thursday, January 7, 2016

Little Havana Walking One Stop

Miami-Dade Anti-Gang Strategy
The Miami-Dade Anti-Gang Strategy is a collaborative effort by federal, state, and local agencies, as well as non-governmental organizations, to reduce gang-related violence while enhancing public safety through the integration of evidence-based prevention, intervention, suppression, and ex-offender reentry initiatives.

Little Havana Walking One Stop
The award-winning Walking One Stop is an innovative, comprehensive, coordinated community response to violence. It brings together elected officials, faith leaders, social and economic service providers, criminal justice personnel, and concerned community activists to deliver social and economic services to the doorsteps of residents living in neighborhoods that have experienced recent, severe, or persistent incidents of violence. In addition to doorstep service delivery, the Walking One Stop staging area can host your mobile unit or tabletop display, maximizing your outreach potential. Federal, state, and local agencies are encouraged to bring brochures, and we will assist you in distributing them.
On Monday, January 25th at 9:30 a.m., in partnership with the City of Miami Little Havana Neighborhood Enhancement Team (NET), we will gather for a short briefing at the Little Havana NET Office located at 1300 SW 12th Avenue, Miami, and then at 10:00 a.m. the Walking One Stop motorcade will roll out to a Little Havana neighborhood to be announced. Please RSVP by 5:00 p.m. on Friday, January 22nd, and include your name, affiliation, and contact information.
Please post and distribute this invitation to those in your network.  Everybody talks the talk but are you willing to walk the walk?
© 2015 Wayne E. Rawlins

Tuesday, January 5, 2016

The Best-Paid U.S. Executives Don't Work on Wall Street: Chart (BusinessWeek)

Nick Woodman, founder and chief executive officer of GoPro Inc., tops the list as the highest-paid executive in 2014.

For U.S. executives, the road to riches doesn’t necessarily run through Wall Street. Among the country’s 200 highest-paid senior managers, those at consumer-discretionary companies were awarded an average $48.6 million. That’s more than 35 percent above the mean paycheck of their colleagues in the finance industry, according to the Bloomberg Pay Index, which ranks compensation using the most recent data as of a company’s fiscal year-end.

GoPro Inc. founder Nick Woodman tops the list with a $287.2 million pay package that made him the highest-paid executive for 2014, followed by Liberty Global Plc CEO Michael Fries, whose pay was valued at $139.4 million. Six of the 10 best-paid executives worked at consumer-discretionary firms, which include entertainment companies such as Discovery Communications Inc. and retailers such as TJX Cos.

The top-paid finance manager was William P. Foley, executive chairman of title insurer Fidelity National Financial Inc., with a $104.9 million package.