Microtaur

Note to self: Check the size of a monster in the Monster Manual before printing it in 3D. I printed a group of minotaurs of medium size, a head taller than a human. That was basically the size I had in mind because of the tauren in World of Warcraft. But then I realized that in D&D a minotaur is of “large” size, which means that he takes up 2×2 spaces on a battlemap. So I need to print him with a 2″ base, and make him at least 2″ tall for that to look proportional. So I threw away my “microtaurs” and printed the group in large size instead.

The adventure I am preparing has a number of large or even bigger monsters: minotaurs, a beholder, a hill giant, an oni, and a dragon. And I must say that I am quite pleased with how those came out of the 3D printer. The larger models have fewer problems with thin parts being too thin to print right. The details come out a lot better. And as the software automatically fills the bulk with a mostly hollow support structure, I can print them to scale without spending a fortune. At the role-playing club I play in there is a cupboard with a collection of painted metal miniatures. But metal is expensive as a material, and heavy in bulk, so the large monsters in that collection are actually not bigger than the medium ones. The beholder in the collection is a sphere of less than 1″ diameter, so my 2″ sphere beholder looks impressive next to it, even if mine is just plastic and unpainted. Not to mention my 4″ tall hill giant and dragon, which I think will really impress my players.

Common HTTP Errors

Every HTTP transaction includes a status code sent back by the server to indicate how it handled the request.
Apart from the 404 error, how many other HTML error pages do you know about? Have you ever thought about what happens in the background when you see any of these HTML error pages on your screen?
Those codes are meant to convey important information to the user. Using them properly reduces your bounce rate, improves your search engine ranking and gives you insight into the performance of your site.

Status Codes

Status codes are three-digit numbers. The first digit marks the class of the status code:
  • 1XX status codes have informational purposes
  • 2XX indicates success
  • 3XX is for redirection
None of these three classes results in an HTML error page, because in these cases the client knows what to do and carries on with the task without hesitation.

What we usually see are the 4XX and 5XX kinds:

  • 4XX represents client-side errors
  • 5XX indicates problems on the server side
HTML error pages are displayed in these cases because the client has no idea how to move on.
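To make the classes concrete, here is a minimal Python sketch (using the widely available requests library; the URL is a placeholder) that checks which class a response’s status code falls into:

    # Minimal sketch: classify an HTTP response by the first digit of its
    # status code. The URL is a placeholder; any reachable page will do.
    import requests

    # allow_redirects=False so that 3XX answers are visible instead of
    # being followed automatically.
    response = requests.get("https://example.com/some/page", allow_redirects=False)
    code = response.status_code

    if 100 <= code < 200:
        print(f"{code}: informational")
    elif 200 <= code < 300:
        print(f"{code}: success")
    elif 300 <= code < 400:
        print(f"{code}: redirection, see the Location header")
    elif 400 <= code < 500:
        print(f"{code}: client-side error, e.g. 404 Not Found")
    else:
        print(f"{code}: server-side error, e.g. 500 Internal Server Error")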

Let’s look at some client-side and server-side HTTP error codes.

Client-Side Errors (4XX)

400 – Bad Request

Whenever the client sends a request the server is unable to understand, the 400 Bad Request error page shows up. It usually happens when the data sent by the browser doesn’t respect the rules of the HTTP protocol, so the web server has no idea how to process a request containing malformed syntax.

Open the same webpage in a different browser, clear the cache, and check whether you are due for security updates. If you regularly run into the 400 error on different sites, your PC or Mac is due for a thorough security checkup.

401 – Authorization Required

When there’s a password-protected webpage behind the client’s request, the server responds with a 401 Authorization Required code. The 401 doesn’t return a classic error page right away, but a popup that asks the user to provide a login-password combination.
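As an illustration, here is a hedged Python sketch of what happens behind that popup; the URL and credentials are made up. The first request comes back with 401, while the same request with a login-password pair supplied via HTTP Basic Auth gets through:

    # Sketch only: the URL and credentials below are placeholders.
    import requests

    url = "https://example.com/protected/report"

    # Without credentials the server answers 401 Authorization Required.
    print(requests.get(url).status_code)                           # e.g. 401

    # Supplying a login-password combination (HTTP Basic Auth) succeeds.
    print(requests.get(url, auth=("user", "secret")).status_code)  # e.g. 200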

403 – Forbidden

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated. By returning the 403 status code the server basically rejects the client with a big loud “No” without any explanation.
The most common reason is that the website owner doesn’t permit visitors to browse the file directory structure of the site.

404 – Not Found


The server has not found anything matching the Request-URI. No indication is given of whether the condition is temporary or permanent.

408 – Request Time-Out

When the request of the client takes too long, the server times out, closes the connection, and the browser displays a 408 Request Time-Out error message. The time-out happens because the server didn’t receive a complete request from the client within the time frame it was prepared to wait.

410 – Gone

The requested resource is no longer available at the server and no forwarding address is known. This condition is expected to be considered permanent. Clients with link editing capabilities SHOULD delete references to the Request-URI after user approval.

If the server does not know, or has no facility to determine, whether or not the condition is permanent, the status code 404 Not Found SHOULD be used instead. This response is cacheable unless indicated otherwise. It’s a good idea to distinguish between 404 and 410 to enhance your Google-friendliness. 
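As a sketch of how a site might implement that distinction, here is a minimal Flask example; the routes and page lists are invented for illustration. It answers 410 for pages known to be permanently removed and 404 for everything else:

    # Illustrative sketch using Flask; the paths and content are hypothetical.
    from flask import Flask, abort

    app = Flask(__name__)

    CONTENT = {"/about": "About us"}                      # pages that exist
    REMOVED = {"/old-campaign", "/discontinued-product"}  # deliberately gone

    @app.route("/<path:page>")
    def serve(page):
        path = "/" + page
        if path in REMOVED:
            abort(410)   # permanently gone: search engines may drop it from the index
        if path not in CONTENT:
            abort(404)   # not found, possibly a temporary condition
        return CONTENT[path]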

Server-Side Errors (5XX)

500 – Internal Server Error

The server encountered an unexpected condition which prevented it from fulfilling the request.
If you encounter the 500 error page on your own site, it is wise to contact your hosting provider. The cause is most likely a permission error, a corrupt .htaccess file or a memory limit that is set too low.

502 – Bad Gateway

The 502 error message represents a communication problem between two servers. It occurs when the client connects to a server acting as a gateway or a proxy that needs to access an upstream server that provides additional service to it.

503 – Service Unavailable

Your web server is unable to handle your HTTP request at the time. There are a myriad of reasons why this can occur but the most common are:

  • server crash
  • server maintenance
  • server overload
  • server under malicious attack
  • a website has used up its allotted bandwidth
  • the server may be forbidden to return the requested document

This is usually a temporary condition. Since you are getting a return code at all, part of the server is working. The web people have made the server return this code until they fix the problem.

If you do not get service back soon, contact your web host, as they would know best. Some web hosts have server status pages you can check.

504 – Gateway Time-Out

There is a server-to-server communication problem behind the Gateway Time-Out error message, just like behind the 502 Bad Gateway error code. When the 504 status code is returned, there’s also a higher-level server in the background that is supposed to send data to the server connected to our client. In this case the lower-level server doesn’t receive a timely response from the upstream server it accessed.


Gardmore Abbey 5E rerun – Session 1

I ran the 4th edition Dungeons & Dragons adventure Madness at Gardmore Abbey back in 2013/2014 and consider it to be the best official 4E adventure. So now I am running it again in a 5E version with different players. I’m not going to write a blow-by-blow journal on this one, but I do want to write down an outline of what happened and my thoughts on it.

In the first session the players started in Winterhaven, where they learned that the card of the Deck of Many Things they had found was one of a set. Lord Padraig of Winterhaven has at least one other card, and is interested in the full deck in order to defend his town. So he told the group about Gardmore Abbey, where most of the cards apparently are scattered, and at the same time asked them to scout the layout and number of orcs there. The players managed to get more information about the abbey from the library of the mage Arris, from Lord Padraig’s counselor Valthrun, and from a bard singing ballads about the place in the inn. Lord Padraig also provided the group with a squire and horses for faster travel between Winterhaven and the abbey.

At the abbey I gave the players a picture of a front view of Gardmore Abbey. And after they scouted the outside of the abbey a bit more, I gave them my player map of Gardmore Abbey. Now the principle of the adventure and the map is that the players can approach the abbey from any side they want: a frontal assault on the main gate, climbing the wall to the north of the gate, going through a hole in the wall to the south of the gate, or trying to get up the hill from the un-walled back side. Of all the groups I’ve read about on the internet having played the adventure, none ever chose the frontal assault on the main gate. In reality the main gate isn’t all that heavily guarded, but a frontal assault on a large army of orcs just doesn’t appear to be a good idea.

So this group went through the hole in the south wall. From there they could go to the watchtower, or straight up the hill through a fey forest, or north along the wall back towards the orc main keep. They first tried the watchtower as a probable vantage point, where they saw some weird scenes from back in time through the windows. They decided not to pursue that further, still didn’t want to move towards the orcs, and thus went up the hill through the fey forest, in order to get a view from the top.

So they came across a magical fountain where a group of high elves was camped. The elves were mistrustful, but not hostile. Their leader Berrian Velfarren told the adventurers that he was there in search of traces of his father, who disappeared centuries ago. He also believed that there were documents somewhere giving the elves some claim on the fey forest. And his sister Analastra had gone missing. After receiving some visions from the magical spring, the group followed the path further up the hill. They came across the groundskeeper’s cottage, where they fought the owlbears now inhabiting it and found the documents the elves were looking for. Then they came to the garden behind the main keep, where another group of rival adventurers was fighting spiders. Trying to help them resulted in the rival adventurers disengaging and leaving the heroes with the spiders. But they did find a sword they had heard about in a ballad about a lost paladin.

Further up the path the group came across some nymphs playing a game of telling each other secrets, and learned some of the secrets of the abbey, including the fact that the missing father had last been seen in the watchtower. Then they came to a bell tower, where Analastra was fighting two displacer beasts and a nest of stirges. The highlight of that fight was the druid keeping Analastra alive with healing words, while using a Call Lightning spell to damage the displacer beasts and eliminate the nest of stirges. Having rescued the sister, the group returned to the elves to rest there.

As they had already finished two of the three quests of the elves, and Berrian had promised them his card of the Deck of Many Things for finishing all three, the group headed to the watchtower next. The elves had said that they couldn’t find an entrance to it. But after some experimentation it turned out that the group’s card opened the door. Stepping inside, though, the group was trapped in some extra-dimensional space connected to the Far Realm, a plane of chaos. In the first room they fought a black pudding (which destroyed the druid’s armor) and two mimics, which had been disguised as cards forming a bridge. After that fight we stopped because it was getting late. But having finished encounters 13, 12, 9, 10, 11, and 14 of the adventure was good progress, 6 encounters out of 33.

On the combat side the encounters were tough, mostly because two players were missing from the group of five. Next session we should be up to 4 players, which will be easier. But I didn’t have to cheat or remove monsters; the adventure was still doable with just 3 players of level 5. They earned about 40% of the XP needed for level 6, so I think that by the end of the adventure they will be at least level 7, if not 8. However, I don’t really have a good follow-up adventure for level 8 characters in store, as all of the official 5E adventures start at low level. Except for The Rise of Tiamat, but that one is the second part of a story that starts with Hoard of the Dragon Queen.

Masters and Servants

If you watch a film or TV series like Downton Abbey, you can learn about how the class structure of society worked a century ago. Many of those concepts of hereditary masters and servants are now completely outdated. But while class borders have become a lot more flexible today, classes still do exist. In today’s economy there are still masters, who are the customers paying for a service, and servants, who then get money for providing those services. Of course the guy who is a servant all day, for example an Uber driver, can come home and become the master by ordering a pizza delivered. But the rich are more likely to receive services, and the poor are more likely to provide those services; we aren’t really much more equal than back in the days of Downton Abbey.

This class divide has also reached games. If you can afford to buy $60 games or spend money in Free2Play games, you get services provided to you. If you play those Free2Play games for free, you end up being the content for other players. It is as if you were paid for providing a service as opponent for another player, only that you don’t get paid in cash but in access to the game.

I don’t like being a servant to a game company. Game companies, like most other companies, treat their customers like royalty, and their employees like garbage. So I don’t want to work for the game company, be the content, provide a service as a cheap replacement of an artificial intelligence. In particular I hate games where even if you pay money, you never can escape from that role as servant, because you always are content for other players.

I just can’t play the new Magic Arena, because it only has a PvP mode. Not only do I not like serving as content for other players, I also don’t like the content that other players provide to me: playing against random humans means total unpredictability, and you can end up against a complete pushover or against the guy who has spent hundreds of dollars and hours on the game and is a complete pro. On the one hand I feel bad if I play against a human and have to quit early because real life intervenes (which makes the game rather unsuitable for mobile platforms), but on the other hand I hate it when my opponent quits early. I much prefer playing against an AI, where there is no social contract, and my opponent plays in a more predictable manner. Previous electronic versions of Magic the Gathering have proven that an AI can be created that plays the game reasonably well. So making a version of Magic without AI feels to me like simple exploitation of players as content, and I’m not willing to be exploited like that.

What is Ethereum? — a short guide


You may be asking yourself, “What is Ethereum?” Well, Vitalik Buterin, a Canadian programmer born in Russia, invented Ethereum in 2015. It’s a cryptocurrency much like Bitcoin that allows you to make payments online. It’s decentralized, offers low transaction fees, and runs on a publicly disclosed blockchain that records each transaction.


Ethereum’s currency is called Ether and is currently the second-largest cryptocurrency in the world by market cap, behind Bitcoin. There are reportedly around two million wallets that hold it, up from 1.6 million in May — showing the growing popularity of Ether.

How is it different from Bitcoin? Bitcoin aims to become a globally adopted currency that could improve or even replace conventional money. Ethereum, on the other hand, is more than a cryptocurrency. It’s also a ledger technology used to build decentralized applications (dapps) with smart contracts.

What are smart contracts?


Smart contracts are programs that automatically execute exactly as they are set up by their creators. Their purpose is to offer more security by removing the middlemen that we would otherwise have to use. Confused? Let’s take a look at a simple example.

Let’s say you want to ship a large gift to your friend and hire a trucker to do the job. For the trucker to know you’ll pay him, and for you to be sure the delivery will be made, you both sign an agreement for shared peace of mind. This takes time and can be expensive, as you need someone who will draw up the paperwork for you, and so forth.

This process can be simplified with a smart contract. You make the payment the day the package is picked up, and the smart contract will automatically transfer the money to the trucker as soon as your friend confirms the delivery has been made.
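Real Ethereum smart contracts are written in languages such as Solidity and run on the blockchain itself, but the escrow logic from the trucking example can be sketched in ordinary Python to show the idea; all names and amounts here are invented:

    # Conceptual sketch only, not actual smart contract code.
    class ShippingEscrow:
        def __init__(self, sender, recipient, trucker, amount):
            self.sender = sender          # the person shipping the gift
            self.recipient = recipient    # the friend who confirms delivery
            self.trucker = trucker
            self.amount = amount
            self.funded = False
            self.delivered = False

        def deposit(self, payer, amount):
            # The sender locks the payment the day the package is picked up.
            if payer == self.sender and amount == self.amount:
                self.funded = True

        def confirm_delivery(self, confirmer):
            # Funds are released automatically once the recipient confirms.
            if confirmer == self.recipient and self.funded and not self.delivered:
                self.delivered = True
                return f"release {self.amount} to {self.trucker}"
            return "funds stay locked"

    escrow = ShippingEscrow("you", "your friend", "the trucker", 100)
    escrow.deposit("you", 100)
    print(escrow.confirm_delivery("your friend"))  # -> release 100 to the trucker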

How is Ether created and where can I get it?


Like Bitcoins, Ethers are created through a process called mining. This requires expensive and specialized computers that have to perform complicated calculations. Mining is mainly done by large companies that are compensated for their work with newly minted Ethers.


Unfortunately, you won’t make any money by mining with your personal PC, even if it’s a high-end model. So how can you get your hands on Ethers? You can earn them by providing goods and services to people who can pay you with the digital currency. The second option is to buy them from a marketplace like Coinbase with your credit card.

The Ethers you own are stored in a wallet secured with a private key. You can keep it in the cloud or offline, with the latter being a much safer option. The important thing is that you don’t lose the private key. If that happens, you won’t be able to access your money.

How much does it cost and what determines the price?


Now that we have figured out the answer to the “What is Ethereum?” question, how much do Ethers really cost? Ethers were cheap when introduced back in 2015 — you could get one for less than a dollar. Their price has risen over the years and currently stands at around $430 each. The sharp increase means Ethers can be a great investment, just like Bitcoins and many other cryptocurrencies. For example, if you bought $1,000 worth of Ethers in 2015 when they were worth $0.50 apiece, you would have $860,000 today.


Before you get too excited, sell your house, and buy as many Ethers as you can get, let me remind you that investing in cryptocurrencies can be risky. Sure, a lot of them have increased in value in recent years, but that doesn’t mean this trend will continue. Cryptocurrencies are volatile, meaning their price can go up and down significantly in a single day. This makes them less stable than standard currencies like the dollar and euro.

How exactly do we determine their value? Like Bitcoins, gold, oranges, and every other item available on the market, supply and demand determine the price of Ethers.



Ethereum can be hard to understand at times. The same goes for Bitcoins and the rest of the cryptocurrencies available. But the fact is that they’re here to stay and might become a more important part of our daily lives in the future.

Many experts believe Ethereum has a lot of potential and could overtake Bitcoin as the largest cryptocurrency somewhere down the line. This is all speculation, though well within the realm of possibility. But like with stocks, gold, and other investments, no one can be 100 percent sure in which direction the price will move.

Hopefully we have given you an answer to the “What is Ethereum?” question. What are your thoughts on Ethereum and cryptocurrencies in general? Let us know in the comments.

Web Scraping Tools: So Much Data – Analysis Needed!


What is Web Scraping?


Web Scraping (also termed Screen Scraping, Web Data Extraction, Web Harvesting etc.) is a technique employed to extract large amounts of data from websites. Web scraping software may access the World Wide Web directly using the Hypertext Transfer Protocol, or through a web browser.
While web scraping can be done manually by a software user, the term typically refers to automated processes implemented using a bot or web crawler. It is a form of copying, in which specific data is gathered and copied from the web, typically into a central local database or spreadsheet, for later retrieval or analysis.
These tools are useful for anyone trying to collect some form of data from the Internet. Web scraping is the new data entry technique that doesn’t require repetitive typing or copy-pasting.
These tools look for new data manually or automatically, fetching the new or updated data and storing it for easy access. For example, one may collect info about products and their prices from Flipkart using a scraping tool.
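As a rough illustration of the idea (not of any particular product below), here is a short Python sketch using the requests and beautifulsoup4 libraries; the URL and CSS classes are placeholders, and any real site’s markup and terms of use would differ:

    # Hypothetical example: collect product names and prices into a CSV file.
    import csv
    import requests
    from bs4 import BeautifulSoup

    page = requests.get("https://example.com/products")       # placeholder URL
    soup = BeautifulSoup(page.text, "html.parser")

    rows = [("name", "price")]
    for item in soup.select(".product"):                       # invented CSS class
        name = item.select_one(".name").get_text(strip=True)
        price = item.select_one(".price").get_text(strip=True)
        rows.append((name, price))

    with open("products.csv", "w", newline="") as f:
        csv.writer(f).writerows(rows)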
Let’s look at some web scraping tools:

1. import.io

import.io offers a builder to form your own datasets by simply importing the data from a particular web page and exporting the data to CSV. You can easily scrape thousands of web pages in minutes without writing a single line of code and build 1000+ APIs based on your requirements.
Import.io uses cutting-edge technology to fetch millions of data points every day, which businesses can access for small fees. Along with the web tool, it also offers free apps for Windows, Mac OS X and Linux to build data extractors and crawlers, download data and sync with the online account.

2. Webhose.io

Webhose.io provides direct access to real-time and structured data from crawling thousands of online sources. The web scraper supports extracting web data in more than 240 languages and saving the output data in various formats including XML, JSON and RSS.


Webhose.io is a browser-based web app that uses an exclusive data crawling technology to crawl huge amounts of data from multiple channels in a single API. It offers a free plan for 1,000 requests per month and a premium plan for 5,000 requests per month.


3. Scrapinghub

Scrapinghub is a cloud-based data extraction tool that helps thousands of developers to fetch valuable data. Scrapinghub uses Crawlera, a smart proxy rotator that supports bypassing bot counter-measures to crawl huge or bot-protected sites easily.
Scrapinghub converts the entire web page into organized content. Its team of experts is available to help in case its crawl builder can’t meet your requirements. Its basic free plan gives you access to 1 concurrent crawl, and its premium plan for $25 per month provides access to up to 4 parallel crawls.

4. 80legs

80legs is a powerful yet flexible web crawling tool that can be configured to your needs. It supports fetching huge amounts of data along with the option to download the extracted data instantly. The web scraper claims to crawl 600,000+ domains and is used by big players like MailChimp and PayPal.
Its ‘Datafiniti’ service lets you search the extracted data quickly. 80legs provides high-performance web crawling that works rapidly and fetches the required data in mere seconds. It offers a free plan for 10K URLs per crawl and can be upgraded to an intro plan for $29 per month for 100K URLs per crawl.

5. ParseHub

ParseHub is built to crawl single and multiple websites with support for JavaScript, AJAX, sessions, cookies and redirects. The application uses machine learning technology to recognize the most complicated documents on the web and generates the output file based on the required data format.
ParseHub, apart from the web app, is also available as a free desktop application for Windows, Mac OS X and Linux that offers a basic free plan that covers 5 crawl projects. This service offers a premium plan for $89 per month with support for 20 projects and 10,000 webpages per crawl.


Another Positive Factor From the Alabama Election That Republicans Don’t Want to Talk About

Tuesday’s turnout by race fit historic patterns, but the party white Alabamians voted for didn’t.

There’s one feature of the voting in this week’s Alabama special election that elected Democrat Doug Jones to the U.S. Senate that Republicans aren’t talking about—tens of thousands of white voters who were reliable Republicans voted for the Democrat.

This observation is missing from the mainstream media narrative that correctly, but incompletely, points to historically through-the-roof black voter turnout as a core pillar of Jones’ victory, as Matt Bruenig, who blogs on politics and economics, noted.

“The overwhelming mainstream narrative of Doug Jones’s victory over Roy Moore in Alabama has been focused on black turnout,” Bruenig wrote, citing the New York Times, which reported, “According to CNN exit polling, 30 percent of the electorate was African-American, with 96 percent of them voting for Mr. Jones. (Mr. Jones’ backers had felt he needed to get north of 25 percent to have a shot to win.) A remarkable 98 percent of black women voters supported Mr. Jones. The share of black voters on Tuesday was higher than the share in 2008 and 2012, when Barack Obama was on the ballot.”

But as Bruenig notes, “if you actually look at the exit polling, it is pretty clear that the real story of Jones’ victory was not inordinate black turnout but rather inordinate white support for the Democratic candidate.”

He compiled and compared “the black share of the electorate, black support for Democrats, and the election result for the 2008, 2012 and 2017 Alabama elections,” in which the black share of the electorate was 28 or 29 percent in each case. He then compared the white share of the Alabama electorate, which was also virtually unchanged, between 65 and 68 percent.

“The white share of the electorate is virtually unchanged, but white support for the Democrat changes dramatically, rising all the way to 30 percent in the Jones-Moore election,” Bruenig said. “This white swing towards the Democratic candidate is basically solely responsible for the fact that Jones won rather than losing by over 20 points, which is the typical outcome of a statewide Alabama election that features this level of black turnout.”

Bruenig’s observation doesn’t detract in the slightest from the historic turnout by all the communities of color in Alabama. But it does reveal that many Republicans are not diehard partisans who would never vote for a compelling Democratic candidate.

Some of those white voters were 18 to 44, as media exit polls noted, but others were “white women and college graduates… likely to recoil from Trump’s campaign and swing in Democrats’ direction than white men and those without college degrees.”

However you slice it, Alabama’s special election shows that red-state America is not as monolithic as Republicans would have you believe. That’s another hopeful sign to emerge from Tuesday’s vote.

 


How abundance makes us poorer

Maybe it was to be expected with an offer that involves charity, but it turns out that for me the Humble Bundle Monthly is mostly an investment in a source of philosophical thoughts. When I initially bought the bundle in order to get Civ VI for cheap, I went for the three-month plan. So even though I have since unsubscribed, I just got my second month’s worth of games. And compared to the first month, there are even fewer games in there that I can see myself playing. That is not to say that the offer is a bad one, or that the games on offer are bad. Rather it reflects how my interests have narrowed over time.

I am old enough to remember a time before video games. The first video game I played was Pong on a console that couldn’t play anything else, in black and white on a TV screen. When people got the first consoles with cartridges and computers, kids typically had just a handful of games, not necessarily chosen by themselves. If you only have 3 game cartridges, you will play the hell out of each of those games, whether those are your favorite games or not. Fast forward to 2017, where 7,672 games were released on Steam alone, again nearly doubling the number of Steam games available for a fourth year in a row.

Everybody has favorite games and favorite genres. If you are limited by the number of games available to you, you play what you got regardless of genre. If you have an abundance of choice, you get more and more picky and only play your favorite genres. The bottleneck becomes the amount of time available to play, so why should you play let’s say a platformer if you prefer role-playing games? Of course the consequence of that is that you end up with a much narrower experience. You only play a handful of favorite genres and don’t have the time for a bunch of other genres, which might offer a very different experience of gaming.

I see a parallel to the world of news and politics. Back in the day when your only source of news was the one paper you and everybody in your street subscribed to, you all got the same variety of news and opinions. Today there are so many sources of news and opinions that you can choose one which aligns well with your own opinions. If you are a fan of Trump, you watch Fox News and read Breitbart; if you are on the other side, you watch CNN and read Huffington Post. But the result is that you end up in an echo chamber which doesn’t allow for a variety of opinions. This has gone so far that the echo chambers of today don’t even agree on the same set of facts. A news source that reports something uncomfortable to you is “fake news”; truth has become subservient to opinion.

The future is one in which we lead comfortable lives in which we play only our favorite games, see only our favorite genre of movies and TV shows, and hear only news that pleases us. Until we have become so isolated from another group of people (which might well be our neighbors) that the two groups don’t consider each other to be of the same kind any more, and start killing each other off. The internet, which promised to offer us a much wider range of everything from information to entertainment, ends up making us all poorer and more narrow-minded.

How Does Big Data Analytics Make Cities Smarter?


There has been a lot of activity around the smart city concept for some time, and cities around the world are being identified as future smart cities. Theoretically at least, smart cities can fundamentally change our lives on many levels: less pollution, less garbage, fewer parking problems and more energy savings. Though the prospect seems enticing, implementation of the smart city concept around the world has been sporadic at best, for several reasons. Whatever stage smart city implementation is at globally, big data and the Internet of Things (IoT) have the power to drive it forward.

Undoubtedly, the main strength of the big data concept is the influence it will have on numerous aspects of a smart city and consequently on people’s lives. Big data is growing rapidly, currently at a projected rate of 40% growth in the amount of global data generated per year versus only 5% growth in global IT spending. Around 90% of the world’s digitized data was captured over just the past two years. As a result, many governments have started to utilize big data to support the development and sustainability of smart cities around the world. That has allowed cities to maintain the standards, principles, and requirements of smart city applications by realizing the main smart city characteristics. These characteristics include sustainability, resilience, governance, enhanced quality of life, and intelligent management of natural resources and city facilities.

Big Data in Smart Cities

If major cities were to invest in smart transport systems today, then by 2030 they would save around $800 billion annually. On top of that, smart transport systems also contribute in a few other ways, including:
  • less automobile congestion and fewer accidents
  • advances in faster long-distance travel
  • cleaner air from the reduction of pollution
  • an abundance of new jobs from updates to transportation networks

Furthermore, any upgraded transportation option appeals to established businesses looking for a new locale, as it does to startup businesses. Any business wants to know that its workers and clients have access to efficient modern transportation. That access lowers annual budgets for businesses in terms of what they pay in gas mileage and delivery costs.
Big data tracks transportation infrastructure needs and costs, helping cities define ways to expand their public transport options in the most efficient way possible. It shows what areas of the city need to open up and how receptive people are to initiatives to raise money for such projects. Cities that use this type of big data analytics are called smart cities, and much of the world wants in on the innovations.
Many major cities are starting to use INRIX, a system that analyzes data from traditional road sensor networks and from mobile devices. San Francisco’s Metropolitan Transportation Commission saved over $250,000 per year from the direct data collection of INRIX.

Big Data in Law Enforcement

Contrary to popular belief, when it comes to fighting crime, big data is actually allowing police and other law enforcement officers to behave less like Big Brother, not more. Data analytics allows law enforcement officers to track real trouble spots and dangerous criminals.


Many local agencies are starting to use PredPol and other predictive policing systems, which collect three main data points from every report (type of crime, location and time of the incident) to make more accurate officer deployment decisions in the future.
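This is not PredPol’s actual algorithm, but a toy Python sketch of how those three data points could be aggregated into deployment hints, using invented incident data:

    # Toy illustration: count incidents per (area, time window) to find hotspots.
    from collections import Counter

    # Hypothetical reports: (crime type, area, hour of day)
    reports = [
        ("burglary", "riverside", 23),
        ("burglary", "riverside", 1),
        ("assault",  "downtown",  2),
        ("burglary", "riverside", 22),
    ]

    # Group hours into 6-hour windows and count incidents per area and window.
    hotspots = Counter((area, hour // 6) for _, area, hour in reports)

    for (area, window), count in hotspots.most_common(3):
        print(f"{area}, hours {window * 6:02d}-{window * 6 + 5:02d}: {count} incidents")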




Once areas of high criminal activity are identified, new education initiatives and outreach programs can be deployed in those jurisdictions.


Big Data in Education

The collection and analysis of big data helps educators understand which students need help and why they need it, as well as identify areas in which they excel.
Educators can provide relevant individual and group activities to support each student’s goals and needs. Teachers will be able to assess student progress on a consistent basis in order to challenge students and help them grow.
The analytics provide more three-dimensional insight into students’ progress while giving parents a way to understand how each child learns.
AltSchool is one of the first K-8 schools providing this personalized learning experience, which is currently only available in developing smart cities such as San Francisco and New York.
The introduction of big data in the education space has also encouraged students of all ages to learn remotely in the comfort of their homes. Massive open online courses collect data from millions of course takers and analyze it to find trouble areas that cause students to fail. After analyzing millions of data points, algorithms continually update each course to deliver an “adaptive learning experience” based on each individual’s strengths, weaknesses and preferences.
These are just two examples of the many ways smart cities are adapting schools into more personalized and remote learning platforms which may change the learning experience forever.

Big Data in Health

The United Nations says that by 2050, 66% of the world’s population will be considered urban. With populations living in such close proximity, health initiatives must be available to everyone no matter their background, race or economic status.
Big data can already predict the outbreaks of viruses and even track cases of depression. Smart cities will use millions of sensors that provide personalized medical services. Many citizens of smart cities will be able to activate their medical service via a mobile app or free-standing kiosks throughout the city. PulsePoint Respond is a great example of a personalized app that alerts CPR-trained bystanders of sudden cardiac arrests within their immediate area.
On top of that, smart cities have already started testing systems that allow elderly patients the option to remain in their homes instead of in a nursing care facility. These types of systems include a standalone tablet with Skype, used for video communication between the patient and their remote caregiver, plus wireless home sensors.
The wireless sensors monitor the house and send alerts about safety situations such as a left-on stove or doors opening in the middle of the night. After testing this system in Oslo, Norway, the study showed that it can save $85,000 per person, since they don’t have to move into a nursing facility.
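A system like that boils down to simple rules over sensor readings. The sketch below, with invented sensor names and thresholds, shows the kind of checks that could trigger those alerts:

    # Hypothetical rule checks over home sensor readings.
    from datetime import time

    def safety_alerts(readings, now):
        """readings: dict of sensor name -> state; now: datetime.time"""
        alerts = []
        if readings.get("stove") == "on" and readings.get("kitchen_motion") == "none":
            alerts.append("Stove left on with nobody in the kitchen")
        is_night = now >= time(23, 0) or now <= time(5, 0)
        if is_night and readings.get("front_door") == "open":
            alerts.append("Front door opened in the middle of the night")
        return alerts

    print(safety_alerts({"stove": "on", "kitchen_motion": "none",
                         "front_door": "closed"}, time(2, 30)))
    # -> ['Stove left on with nobody in the kitchen']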

Big Data in Energy Usage

Over 75% of the world’s energy consumption comes from cities, and 40% of municipal energy costs come solely from street lighting. Since adopting smart street lights, which automatically adjust light levels to suit the needs of citizens, Lansing, Michigan has saved 70% of its energy cost.
Experts predict that by 2020 there will be over 100 million of these smart light bulbs and lamps in use worldwide. Other cities like Charlotte, North Carolina have implemented smart building energy management, which cut their total energy use by 8.4% and greenhouse gas emissions by 20%.
Moreover, the Spanish town of Santander installed 12,500 air pollution and RFID sensors around the city, which cut energy costs by 25% and waste management costs by an additional 20%. Smart cities are barely underway, yet they are already making a substantial impact on the environment and on the citizens living in them.
Masdar City in Abu Dhabi and Songdo in South Korea are prime examples of connected cities that, using a local energy optimisation system, materialise the promises of a zero emission, zero waste model. All of the data from the sensors, spread throughout the city, are analysed in real time to optimise a number of aspects of inhabitants’ lives.

