
Posts

Showing posts from April, 2015

What Is Tableau Software

Tableau originated in the Computer Science department at Stanford University, starting as a Department of Defense-sponsored research project. Chris Stolte, a Ph.D. candidate, was researching visualization techniques for exploring relational databases and data cubes.

Stolte's Ph.D. advisor, Professor Pat Hanrahan, a founding member of Pixar and chief architect of Pixar's RenderMan, was a world-leading expert in the science of computer graphics. Chris, Pat, and a team of Stanford Ph.D.s realized that computer graphics could deliver huge gains in people's ability to understand databases.
Their invention, VizQL™, brought together these two computer science disciplines for the first time. VizQL lets people analyze data just by building drag-and-drop pictures of what they want to see. While Tableau 8 improves on the previous seven major releases of the software, the core approach to visual design r…

1000 SQL Queries For Practice (Part-1)

Welcome to the SQL tutorial. As I mentioned in my previous posts, learning SQL takes a lot of practice, not just reading.

In this series of posts, I am sharing the best SQL examples for practice, so that you can become a SQL professional with my tutorials.

Always look for SQL/PL-SQL jobs: there is great demand ahead for people who have hands-on experience.

Download Free: 1000 SQL Queries
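You don't need a full database server to start practicing. Here is a minimal sketch that runs practice queries against an in-memory SQLite database from Python; the `emp` table and its rows are illustrative only, not part of the downloadable query set:

```python
import sqlite3

# Build a small in-memory database to practice SQL queries against.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE emp (id INTEGER PRIMARY KEY, name TEXT, dept TEXT, salary REAL)"
)
conn.executemany(
    "INSERT INTO emp (name, dept, salary) VALUES (?, ?, ?)",
    [("Asha", "SALES", 50000), ("Ravi", "SALES", 62000), ("Meena", "HR", 45000)],
)

# Practice query: highest salary per department.
rows = conn.execute(
    "SELECT dept, MAX(salary) FROM emp GROUP BY dept ORDER BY dept"
).fetchall()
print(rows)  # [('HR', 45000.0), ('SALES', 62000.0)]
```

Swapping in each query from the practice list and checking the result against your expectation is a fast feedback loop for learning.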

Overview Of Cloud Standards

Cloud computing is slowly becoming a reality, so it has to address many concerns such as security, interoperability, portability, and governance at the earliest opportunity. Adoption can be accelerated by compliance with guidelines and standards defined in consensus by cloud providers. Without addressing these concerns, users would be wary to tread this path despite its powerful economic model for business computing.
Interoperability/integration: Interoperability enables products and software components to work with or integrate with each other seamlessly, in order to achieve the desired result. It thus provides the flexibility and choice to use multiple products to meet a need. This is enabled either by integrating through standard interfaces or by means of a broker that converts one product's interface to another.
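The broker idea can be sketched in a few lines. The vendor classes and method names below are hypothetical stand-ins for two cloud storage products with incompatible interfaces; the broker exposes one standard call and converts it to whichever backend it wraps:

```python
class VendorAStorage:
    """One provider's storage interface (hypothetical)."""
    def put_object(self, key, data):
        return f"A stored {key}"


class VendorBClient:
    """A second provider with a different interface (hypothetical)."""
    def upload(self, name, payload):
        return f"B stored {name}"


class StorageBroker:
    """Broker that converts a single standard call into
    whichever interface the wrapped backend exposes."""
    def __init__(self, backend):
        self.backend = backend

    def store(self, key, data):
        # Detect which product interface the backend speaks and adapt.
        if hasattr(self.backend, "put_object"):
            return self.backend.put_object(key, data)
        return self.backend.upload(key, data)


broker = StorageBroker(VendorBClient())
print(broker.store("report.csv", b"..."))  # B stored report.csv
```

Callers depend only on `StorageBroker.store`, so switching providers does not ripple through application code, which is exactly the flexibility the standards effort aims to guarantee at the interface level.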
Cloud computing depends on compliance standards. Security: Security involves the protection of information assets through various policies, procedures, and technologies, which need t…

Oozie - Concepts And Architecture

Oozie is a workflow/coordination system that you can use to manage Apache Hadoop jobs. One of the main components of Oozie is the Oozie server, a web application that runs in a Java servlet container (the standard Oozie distribution uses Tomcat).
The Oozie server is a workflow management server. It supports reading and executing Workflows, Coordinators, Bundles, and SLA definitions, and implements a set of remote Web Services APIs that can be invoked from Oozie client components and third-party applications. The server's execution leverages a customizable database.

This database contains Workflow, Coordinator, Bundle, and SLA definitions, as well as execution states and process variables. The list of currently supported databases includes MySQL, Oracle, and Apache Derby. The Oozie shared library component is located in the Oozie HOME directory and contains code used during Oozie execution.
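Because the remote Web Services APIs are plain HTTP plus JSON, any client can talk to the Oozie server directly. Here is a minimal sketch using only the Python standard library; the host and port are assumptions (11000 is Oozie's conventional default), and the `v1/admin/status` endpoint is the commonly documented system-status call:

```python
import json
from urllib.request import urlopen

# Base URL of the Oozie server (assumed; adjust for your cluster).
OOZIE_URL = "http://localhost:11000/oozie"


def status_endpoint(base_url):
    """Return the v1 admin status URL for an Oozie server."""
    return base_url.rstrip("/") + "/v1/admin/status"


def server_mode(base_url):
    """Query the server's system mode, e.g. NORMAL or SAFEMODE.
    Requires a running Oozie server, so it is not called here."""
    with urlopen(status_endpoint(base_url)) as resp:
        return json.load(resp)["systemMode"]


print(status_endpoint(OOZIE_URL))
```

The same HTTP interface is what the `oozie` command-line client and third-party schedulers use under the hood, which is why Oozie integrates easily into existing tooling.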
What is Oozie: Oozie provides a command-line i…

5 Top Data Storage Patterns to Handle a Variety of Data

Data now comes in a variety of patterns. It is more than just plain text; it can live in various persistence and storage mechanisms, with the Hadoop Distributed File System (HDFS) being one of them.

The way data is ingested, and the sources it is ingested from, affect how it is stored. Likewise, how the data is pushed to downstream systems, or accessed by the data access layer, determines how it should be stored.
Role of RDBMS: The need to store huge volumes of data has forced databases to follow new rules of data relationships and integrity that differ from those of relational database management systems (RDBMS). RDBMS follow the ACID rules: atomicity, consistency, isolation, and durability.

These rules make the database reliable for any user. However, searching huge volumes of big data and retrieving results from them would take large amounts of time if all the ACID rules were enforced.
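The "A" in ACID, atomicity, is easy to see in action with SQLite from Python: a transaction either applies completely or not at all. The `accounts` table and the simulated crash below are illustrative only:

```python
import sqlite3

# A toy bank ledger to demonstrate atomic transactions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 0)")
conn.commit()

try:
    with conn:  # opens a transaction; rolls back automatically on exception
        conn.execute(
            "UPDATE accounts SET balance = balance - 100 WHERE name = 'alice'"
        )
        # Simulate a crash after the debit but before the matching credit.
        raise RuntimeError("crash mid-transfer")
except RuntimeError:
    pass

# The half-finished transfer was rolled back, so no money vanished.
balance = conn.execute(
    "SELECT balance FROM accounts WHERE name = 'alice'"
).fetchone()[0]
print(balance)  # 100
```

This guarantee is exactly what makes RDBMS reliable, and also what big data stores often relax (trading strict consistency for speed and scale) when scanning huge volumes.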
A typical scenario is when we search for a certain…

Data warehouse 2.0 in Big Data World

The new data warehouse, often called “Data Warehouse 2.0,” is the fast-growing trend of doing away with the old idea of huge, off-site mega-warehouses stuffed with hardware and connected to the world through huge trunk lines and big satellite dishes. The replacement moves away from that highly controlled, centralized, and inefficient ideal towards a more cloud-based, decentralized mix of varied hardware and widespread connectivity. In today’s world of instant, varied access by many different users and consumers, data is no longer nicely tucked away in big warehouses. Instead, it is often stored in multiple locations (often with redundancy) and in overlapping small storage spaces that are often nothing more than large closets in an office building. The trend is towards always-on, always-accessible, and very open storage that is fast and friendly for consumers yet complex and deep enough to appease the most intense data junkie.

Internet Of Things Basics (Part-1)

IBM is investing $3 billion in the Internet of Things (IoT). IBM estimates that 90 per cent of all data generated by devices like smartphones, tablets, connected vehicles, and appliances is never analysed or acted on. In simple terms, IoT means machine-to-machine connectivity. The emergence of the Internet of Things destroys every precedent and preconceived notion of network architecture. To date, networks have been invented by engineers skilled in protocols and routing theory.
But the architecture of the Internet of Things will rely much more upon lessons derived from nature than upon traditional (and ossified, in my opinion) networking schemes.

This series will consider the reasons why the architecture for the Internet of Things must be fundamentally different from that of the traditional Internet, explore the technical and economic foundations of this new architecture, and finally begin to outline a solution to t…