walls.corpus

By Nathan L. Walls


Articles tagged “civics”

🔗 MRI costs: Why this surgeon is challenging NC’s certificate of need law

Dylan Scott, writing for Vox:

Dr. Gajendra Singh walked out of his local hospital’s outpatient department last year, having been told an ultrasound for some vague abdominal pain he was feeling would cost $1,200 or so, and decided enough was enough. If he was balking at the price of a routine medical scan, what must people who weren’t well-paid medical professionals be thinking?

The India-born surgeon decided he would open his own imaging center in Winston-Salem, North Carolina, and charge a lot less. Singh launched his business in August and decided to post his prices, as low as $500 for an MRI, on a banner outside the office building and on his website.

There was just one barrier to fully realizing his vision: a North Carolina law that he and his lawyers argue essentially gives hospitals a monopoly over MRI scans and other services.

I hope Dr. Singh’s lawsuit succeeds. American healthcare in 2018 is supposed to be driven by consumerism. Call around to different providers and determine how much you’ll pay for quality care. Choose a provider that accepts your insurance, based on wherever you want to land on the quality/price matrix, and you’re golden, right?

No.

Healthcare is not a market. First, not all qualified players can join the market, as is the case here. That effectively prevents Dr. Singh (and others) from putting downward pressure on prices. Second, medical pricing isn’t necessarily discoverable, transparent or negotiable.

NCGA Republican leadership to everyone else: 'Drop dead'

Paul A. Specht and Will Doran writing for the News & Observer, covering the latest in a long series of the North Carolina GOP freezing out Democrats from involvement in the legislative process:

Democrats are upset that Republican legislators are mostly excluding them from state budget talks, as it’s unlikely any proposed changes will be adopted once the budget is revealed.

Republican leaders plan to gut an old bill and amend it as a “conference report” to include their budget plans, meaning state lawmakers will have no method for amending the legislation.

The NCGA leadership isn’t denying the accusation.

Shelly Carver, a spokeswoman for Senate leader Phil Berger, said the purpose of the short session is to adjust the two-year state budget that was passed over a six-month period last year — “not to write an entirely new plan.” Republicans hold a supermajority in the House and Senate, so it’s unclear whether Democratic proposals would be adopted even if under a more open process.

“It’s clear Gov. Cooper and legislative Democrats are upset they won’t be able to abuse that process to try to score political points in an election year, but lawmakers of both parties will have the opportunity to vote on the bill and make their voices heard,” Carver wrote in an email.

The thing is, this supermajority is the result of an unconstitutional, racially based gerrymander. Delaying motions have allowed the state GOP to put off a reckoning with redrawing both the state legislative districts and the state’s congressional districts. So, Democrats won’t be heard during the budget process because the NC GOP explicitly set up the process to allow exactly this.

“(A)n entirely new plan” talks past recent events in Raleigh, specifically the May 18, 2018 rally of state public school teachers for better pay and better school funding.

Jeff Jackson, a state senator representing Charlotte, put it this way as part of a thread:

Ultimately, this is about teachers. Republicans know that Democrats are going to offer amendments to raise teacher pay and Republicans don’t want to be on record voting against that. So they’re going to torpedo the whole process to avoid publicly saying “No” to teachers.

Neither my NC Senate representative nor my NC House representative is involved in this process. I effectively have no representation at the state level.

A tool to facilitate questions about Triangle tactical team usage

Yesterday, David Forbes of The Asheville Blade tweeted the results of a records request he made to local law enforcement agencies in Asheville. He also published a story resulting from the records request:

The unrest in Ferguson, Mo. has raised a multitude of important issues, including systemic racism in law enforcement and the level of violence directed at African-American citizens, like the disturbing shooting of unarmed teenager Michael Brown by a Ferguson police officer.

On Thursday, Aug. 14, I made records requests with the Asheville Police Department and Buncombe County Sheriff’s Office to disclose all the military equipment obtained under this program over the past decade. The Sheriff’s Office responded within 30 minutes and wrote that they are in the process of gathering the information.

A city spokesperson replied the next morning with a similar response. Later that day, they revealed that the only item the APD had obtained through the 1033 program since 2004 was an armored car in 2007. According to city officials, the vehicle is no longer in use.

Just seeing Forbes’ tweets, I wondered about research local to the Triangle on the same topic.

The New York Times has a worthwhile interactive map that Forbes referenced in his story. That’s a start. Going back to my brief story about having the (a?) Raleigh Police Department tactical team in our backyard, I can tell you it was incredibly stressful seeing multiple officers carrying rifles, bringing dogs and shining flashlights in the dark of the woods behind our house looking for one or more armed robbery suspects.

To me, this seemed an appropriate instance to have a tactical team present. However, I want to know about other instances where this team would be deployed. So, I’m thinking through some questions I would like to have answered:

  • Which Triangle law enforcement agencies – local, state or federal – have tactical/S.W.A.T. teams?
  • How large is each agency’s tactical team?
  • What equipment have these agencies acquired from the federal government that a reasonable person would consider “war gear”?
  • Are the tactical teams the only teams with access to this equipment?
  • What rules govern the use of this equipment?
  • Are agencies obligated to use the equipment or return it to the federal government?
  • Under what circumstances is a tactical team activated?
  • Who is responsible for activating a tactical team?
  • Are these teams and equipment preemptively deployed to public events? When and why?
  • How many times a month are these teams deployed?
  • How many times a month is military-grade equipment deployed?
  • Where are these deployments?
  • Would less forceful tactics have been more or less effective? Why?
  • What reports are available regarding these deployments?
  • Were any complaints against officers filed in the wake of the incident?
  • Can these deployments be correlated with news reports of the incident?
  • What are the trends of deployment? Are they going up or down?

A local news agency would do well to ask these questions and report the answers. They would do better to receive ongoing updates of records from area law enforcement agencies to get at trending data or look at particular incidents in more detail. This is in the ballpark of something I could expect to see from EveryBlock, a local Code for America brigade – the Triangle has several – or, again, any of the local news agencies as a public-facing, web-accessible application.

I’m thinking through these questions because these are the sorts of questions I would like answered in the wake of the police response to protests in Ferguson, Missouri after the death of Mike Brown. I want you to think through questions of your own along these lines. These questions don’t replace community involvement in policing through oversight or review commissions. They don’t replace community policing. They don’t replace beat reporting. Instead, the questions a tool like this helps answer should inform deeper conversations about what law enforcement agencies are doing in the name of protecting and serving the public. The goal should be to increase transparency and build trust: communities and governments understand where police powers are used and why they’re used, and citizens believe those powers, when used, are used judiciously, proportionately, appropriately, equally and fairly.

wget of mass destruction

David E. Sanger and Eric Schmitt, reporting for the New York Times, have published an article titled “Snowden Used Low-Cost Tool to Best N.S.A.”. I know they’re reporting for a general audience, but I believe the article does a disservice by allowing anonymous national security “officials” to put simple automation into scare quotes:

Using “web crawler” software designed to search, index and back up a website, Mr. Snowden “scraped data out of our systems” while he went about his day job, according to a senior intelligence official. “We do not believe this was an individual sitting at a machine and downloading this much material in sequence,” the official said. The process, he added, was “quite automated.”

The findings are striking because the N.S.A.’s mission includes protecting the nation’s most sensitive military and intelligence computer systems from cyberattacks, especially the sophisticated attacks that emanate from Russia and China. Mr. Snowden’s “insider attack,” by contrast, was hardly sophisticated and should have been easily detected, investigators found.

Automation gonna automate, I suppose. We’ve seen this dance before with Aaron Swartz, Chelsea Manning and Edward Snowden, and the national security-industrial complex has a disingenuously naïve view of automation tools, particularly around Swartz at MIT and Snowden, suggesting there was a mix of luck and quite possibly something nefarious to all this automation. The New York Times should approach statements made by agency officials skeptically. This sort of programming is not hard. Moreover, no one has to work particularly hard to hide it. In fact, what might look to some like “hiding” would simply be polite engineering under a different lens.

One key is the not-at-all-advanced concept of throttling. Well-behaved web crawlers (also known as spiders) are respectful about how many requests they issue in a given amount of time. A lot of requests all at once will attract exactly the sort of attention that, by the unnamed officials’ own flustered admission, Snowden barely drew to himself.

First, lots of requests in a short amount of time show up in log files as such and quickly become a pattern. Patterns attract attention. Assuming the NSA and its various contractors audit access logs (which itself is something I’d automate), spreading requests over time makes the activity less likely to arouse suspicion. Moreover, unless an audit is looking for a particular type of activity, that manual or automated audit will not care a whit about well-throttled crawler traffic, because it looks a lot like expected traffic. It’s “hiding” to the same degree someone of average height and dress is “hiding” as they walk on a Manhattan sidewalk.
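
To make the throttling point concrete, here’s a minimal sketch in shell of what a polite, spread-out fetch loop looks like. The urls.txt file and the timing are hypothetical; the point is only that pacing requests takes a couple of lines, not sophistication:

    # Fetch each URL listed in urls.txt, pausing a random 15-45 seconds
    # between requests so the traffic blends into normal access-log patterns.
    while read -r url; do
      wget --quiet "$url"
      sleep $(( (RANDOM % 31) + 15 ))
    done < urls.txt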

Second, setting aside any activity logs, system activity monitors seem more likely to catch a misbehaving web crawler. System activity monitors look at how much work a machine is doing at a given time. Typical checks look at how busy the CPU is, how much RAM is in use, overall network activity, what processes (“programs”) are running and so on. Some servers have automated checks in place, some don’t. For the sake of discussion, I assert the servers hosting the content Snowden accessed were monitored in such a fashion. Now, assume each server’s activity varies, but stays within an average band. Unless what Snowden was doing with his web crawler caused one of these checks to go out of bounds, it was unlikely to attract attention. Normal activity gets ignored.
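
For a sense of what those checks amount to, here are the stock *NIX commands a monitoring system effectively runs on a schedule and compares against expected ranges; the specifics vary by platform, so treat this as illustrative:

    uptime        # load averages: how busy the CPU has been
    free -m       # memory in use (Linux; vm_stat on Mac OS X)
    ps aux        # which processes are running and what they're consuming
    netstat -i    # per-interface network traffic counters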

On to the alleged crawling software itself.

In interviews, officials declined to say which web crawler Mr. Snowden had used, or whether he had written some of the software himself. Officials said it functioned like Googlebot, a widely used web crawler that Google developed to find and index new pages on the web. What officials cannot explain is why the presence of such software in a highly classified system was not an obvious tip-off to unauthorized activity.

First, Snowden’s job was as a systems administrator. Systems administration and development jobs involve access to not-in-any-way-top-secret technologies like *NIX servers, which typically have a wide array of built-in scripting languages (Perl and Python most likely, Ruby very possibly). Or perhaps Snowden is a shell scripter; Bash will get the job done.

As software goes, a basic web crawler is not exceptionally hard. I assert that if it’s written with tools likely already resident on any average server or *NIX-based laptop (e.g. Mac OS X, Linux, possibly Windows with PowerShell), there’s really nothing about one that would raise any particular suspicion. Effectively, the raw pieces of the web crawler were quite likely already present. Writing a text file to marshal those raw pieces together is unlikely to raise suspicion, because a systems administrator or software developer already has scores of similar files lying around. There’s no magic “web crawler” bit that flips and alerts anyone.
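
To illustrate how unremarkable those raw pieces are, here is roughly what crawling an internal site could look like using nothing but wget, which either ships with or is trivially installed on such systems. The URL is obviously hypothetical, and this is a sketch, not a claim about what Snowden actually ran:

    # Mirror an internal site, following links and politely waiting between requests.
    wget --mirror --no-parent --wait=30 --random-wait --quiet \
         https://wiki.internal.example/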

As a thought experiment, what happens if every machine is audited and new and modified files are flagged, logged and sent off somewhere for analysis? Probably nothing, because in a large working group a lot of these files are going to look very similar to one another, with innocuous or cryptic names, and it would be a nigh-impossible task to write meaningful software to determine what all of these new files are for and, if they’re programs, what they do. Surely, no one is going to look at each one of these files. It’d be soul-sucking work.
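
For what it’s worth, the flagging half of that thought experiment is itself a one-liner, which is part of why the resulting pile would be so large. Assuming home directories live under /home:

    # List every file created or modified in the last 24 hours.
    find /home -type f -mtime -1 -ls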

Put another way: hammers, screwdrivers, wrenches, pliers, saws and knives aren’t noteworthy tools in a toolbox. A new hammer on a construction site is unlikely to attract any attention. Similarly, just as carpenters use jigs, painters use scaffolding and auto mechanics use impact wrenches, ramps and hydraulic lifts to make their jobs easier, faster, more consistent and less tedious, systems engineers and developers use scripts. Now, imagine a construction site or factory constantly inspecting everyone’s tool bag and workspace for anything “inappropriate”. It wouldn’t be terribly effective, and it would be a huge burden and expense on the actual work. Imagine your average TSA security line at the office park.

There’s also some question about the web crawler having Snowden’s credentials:

When inserted with Mr. Snowden’s passwords, the web crawler became especially powerful. Investigators determined he probably had also made use of the passwords of some colleagues or supervisors.

But he was also aided by a culture within the N.S.A., officials say, that “compartmented” relatively little information. As a result, a 29-year-old computer engineer, working from a World War II-era tunnel in Oahu and then from downtown Honolulu, had access to unencrypted files that dealt with information as varied as the bulk collection of domestic phone numbers and the intercepted communications of Chancellor Angela Merkel of Germany and dozens of other leaders.

Officials say web crawlers are almost never used on the N.S.A.’s internal systems, making it all the more inexplicable that the one used by Mr. Snowden did not set off alarms as it copied intelligence and military documents stored in the N.S.A.’s systems and linked through the agency’s internal equivalent of Wikipedia.

As noted above, there’s nothing particularly special about a web crawler versus any other manner of script. It’s easy to inform utilities like wget and curl about authentication parameters and to keep login cookies. It’s also easy for such a web crawler to announce itself to the server it requests information from in any manner it likes. There’s a convention around sending an identification string, as Google and Yahoo do for their web crawlers, but it’s just as easy to call a web crawler Mozilla/5.0 (Windows NT 6.3; Trident/7.0; rv:11.0) like Gecko or Internet Explorer 11. Add in the polite engineering of not requesting every page the web crawler sees as soon as it processes each preceding page, and it’s going to be far less obvious that traffic to a web server is coming from a script instead of a human clicking a link. There’s not necessarily anything nefarious going on.
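
As a sketch of how little it takes, here is a single curl invocation that reuses saved login cookies and announces itself with the Internet Explorer 11 identification string mentioned above. The cookie file and URL are hypothetical; wget offers equivalent --load-cookies and --user-agent options:

    # Request a page while presenting saved cookies and an arbitrary User-Agent.
    curl --silent \
         --cookie cookies.txt --cookie-jar cookies.txt \
         --user-agent "Mozilla/5.0 (Windows NT 6.3; Trident/7.0; rv:11.0) like Gecko" \
         https://wiki.internal.example/SomePage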

If Snowden had access to all of these systems, and accessing what sounds equivalent to a corporate intranet was not going to arouse suspicion, there’s little I can think of about this conceptual web crawler that would tip the balance toward his being caught. If the NSA wasn’t going to catch Snowden doing all of the work himself, it’s no more likely they were going to catch an automated process he wrote.

I don’t find any part of this story surprising from a technical standpoint. What I do find somewhat distressing is that unnamed officials think this is special or confers villainous status on Snowden. It doesn’t, just as it should not have with Aaron Swartz. Said officials should know better, and if they don’t, they need to find technical advisors who will correctly inform them.

I bring this all up because I would like reporters on stories such as this to find an average systems administrator, security analyst or software engineer to talk to in order to provide perspective. The New York Times has an excellent digital staff with developers who could easily demonstrate what a similar script would look like and how it would work internally. Surely, a news organization that builds great interactive stories and is growing more comfortable in its own skin online can use some agency, and draw on some of the experience that’s helping to provide that comfort, to call officials on bad, self-serving analysis like this.
