Monday 30 September 2013

Web Scraper Shortcode WordPress Plugin Review

This short post is on the WordPress plugin called Web Scraper Shortcode, which enables one to retrieve a portion of a web page, or a whole page, and insert it directly into a post. The plugin can be used to pull fresh data or images from other web pages into your WordPress-driven site without visiting them yourself. More scraping plugins and software can be found here.

To install it in WordPress go to Plugins -> Add New.
Usage

The plugin scrapes the page content and applies parameters to this scraped page if specified. To use the plugin just insert the

[web-scraper ]

shortcode into the HTML view of the WordPress page where you want to display the excerpts of a page or the whole page. The parameters are as follows:

    url – the URL of the page to be scraped (self-explanatory).
    element – the DOM navigation notation for the element, similar to XPath.
    limit – the maximum number of elements to be scraped and inserted if the element notation points to several of them (like elements of the same class).

The plugin uses DOM (Document Object Model) notation, where consecutive DOM nodes are written as node1.node2; for example: element = 'div.img'. A specific element is targeted through the '#' notation. For example, if you want to scrape several 'div' elements of the class 'red' (<div class='red'>…</div>), you need to specify the element attribute this way: element = 'div#red'.
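
Putting these parameters together, a complete shortcode might look like the following (the URL is a made-up example; the parameter names and the 'div#red' notation are the ones described above):

    [web-scraper url='http://example.com/news/' element='div#red' limit='3']
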
How to find DOM notation?

But how can an inexperienced user find the DOM notation of the desired element(s) on a web page? Web Developer Tools are a handy means for this. I would refer you to this paragraph on how to invoke Web Developer Tools in the browser (Google Chrome) and select a single page element to inspect it. As you select it with the 'loupe' tool, you'll see a blue box on the bottom line with the element's DOM notation:


The plugin content

As someone who works with web scraping, I was curious about the means the plugin uses for scraping. Looking at the plugin code, it turned out that the plugin acquires a web page through the 'simple_html_dom' class:

    require_once('simple_html_dom.php');
    $html = file_get_html($url);
    // the code then iterates over the designated elements, up to the set limit

Pitfalls

    Be careful if you put two or more [web-scraper] shortcodes on your website, since downloading other pages will drastically slow the page load speed. Even if you want only a small element, the PHP engine first loads the whole page and then iterates over its elements.
    You need to remember that many pictures on the web are referenced by relative (shortened) URLs. When such an image gets extracted, it may show up as a broken image on your page, since the URL is relative and the plugin does not take note of its base URL (a small sketch of resolving such relative URLs follows this list).
    The error “Fatal error: Call to a member function find() on a non-object …” will occur if you put this shortcode in a text-overloaded post.
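
The relative-URL pitfall above can be worked around by resolving each extracted image src against the base URL of the scraped page. This is not something the plugin does itself; here is a minimal Python sketch of the general idea (the URLs are hypothetical):

    from urllib.parse import urljoin

    base_url = 'http://example.com/articles/post.html'   # page the image was scraped from
    src = '../images/photo.jpg'                          # relative src as extracted
    print(urljoin(base_url, src))                        # -> http://example.com/images/photo.jpg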

Summary

I'd recommend using this plugin for short posts that need to be enriched with elements from other pages. Its use is limited, though.



Source: http://extract-web-data.com/web-scraper-shortcode-wordpress-plugin-review/

Sunday 29 September 2013

Microsys A1 Website Scraper Review

The A1 scraper by Microsys is a program mainly used to scrape websites and extract data in large quantities for later use in web services. The scraper extracts text, URLs, etc., using multiple regexes and saves the output into a CSV file. This tool can be compared with other web harvesting and web scraping services.
How it works
This scraper program works as follows:
Scan mode

    Go to the ScanWebsite tab and enter the site’s URL into the Path subtab.
    Press the ‘Start scan‘ button to cause the crawler to find text, links and other data on this website and cache them.

Important: URLs that you scrape data from have to pass the filters defined in both the analysis filters and the output filters. Those filters can be set in the Analysis filters and Output filters subtabs respectively, and they must be defined at the website analysis stage (mode).
Extract mode

    Go to the Scraper Options tab.
    Enter the regex(es) into the Regex input area.
    Define the name and path of the output CSV file.
    The scraper automatically finds and extracts the data according to the regex patterns.

The result will be stored in one CSV file for all the given URLs.

It is worth mentioning that the set of regular expressions will be run against all the scraped pages.
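
To make that concrete, here is a minimal sketch, in Python rather than in A1 itself, of what "run a set of regexes against every scraped page and write the matches into one CSV file" amounts to (the page contents, pattern and file name are invented for illustration):

    import csv
    import re

    pages = {                                   # hypothetical cached pages: URL -> HTML
        'http://example.com/a': '<span class="price">$19.99</span>',
        'http://example.com/b': '<span class="price">$4.50</span>',
    }
    patterns = [re.compile(r'class="price">\$([0-9.]+)<')]   # the regex set

    with open('output.csv', 'w', newline='') as f:
        writer = csv.writer(f)
        writer.writerow(['url', 'match'])
        for url, html in pages.items():
            for rx in patterns:                 # every regex runs against every page
                for match in rx.findall(html):
                    writer.writerow([url, match])
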
Some more scraper features

Using the scraper as a website crawler also affords:

    URL filtering.
    Adjustment of the speed of crawling according to service needs rather than server load.

If you need to extract data from a complex website, just disable Easy mode by pressing the corresponding button. A1 Scraper's full tutorial is available here.
Conclusion

The A1 Scraper is good for mass gathering of URLs, text, etc., with multiple conditions set. However, this scraping tool is designed to use only regexes, which can greatly increase parsing time.



Source: http://extract-web-data.com/microsys-a1-website-scraper-review/

Friday 27 September 2013

Visual Web Ripper: Using External Input Data Sources

Sometimes it is necessary to use external data sources to provide parameters for the scraping process. For example, you have a database with a bunch of ASINs and you need to scrape all product information for each one of them. As far as Visual Web Ripper is concerned, an input data source can be used to provide a list of input values to a data extraction project. A data extraction project will be run once for each row of input values.

An input data source is normally used in one of these scenarios:

    To provide a list of input values for a web form
    To provide a list of start URLs
    To provide input values for Fixed Value elements
    To provide input values for scripts

Visual Web Ripper supports the following input data sources:

    SQL Server Database
    MySQL Database
    OleDB Database
    CSV File
    Script (A script can be used to provide data from almost any data source)

To see it in action you can download a sample project that uses an input CSV file with Amazon ASIN codes to generate Amazon start URLs and extract some product data. Place both the project file and the input CSV file in the default Visual Web Ripper project folder (My Documents\Visual Web Ripper\Projects).
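
The core idea of that sample project, turning each input row into a start URL, can be sketched outside Visual Web Ripper in a few lines of Python (the CSV file name is a made-up example; the /dp/ URL pattern is the usual Amazon product-page form):

    import csv

    with open('asins.csv', newline='') as f:                 # one ASIN per row
        asins = [row[0] for row in csv.reader(f) if row]

    start_urls = ['http://www.amazon.com/dp/' + asin for asin in asins]
    for url in start_urls:
        print(url)   # the extraction project is then run once per generated URL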

For further information please look at the manual topic explaining how to use an input data source to generate start URLs.


Source: http://extract-web-data.com/visual-web-ripper-using-external-input-data-sources/

Wednesday 25 September 2013

A simple way to turn a website into JSON

Recently, while surfing the web I stumbled upon a simple web scraping service named Web Scrape Master. It is a kind of RESTful web service that extracts data from a specified web site and returns it to you in JSON format.
How it works

Though I don’t know what this service may be useful for, I still like its simplicity: all you need to do is to make an HTTP GET request, passing all necessary parameters in the query string:
http://webscrapemaster.com/api/?url={url}&xpath={xpath}&attr={attr}&callback={callback}

    url  - the URL of the website you want to scrape
    xpath – the XPath expression determining the data you need to extract
    attr - the name of the attribute whose value you need to get (optional)
    callback - JSON callback function (optional)

For example, for the following request to our testing ground:

http://webscrapemaster.com/api/?url=http://testing-ground.extract-web-data.com/blocks&xpath=//div[@id=case1]/div[1]/span[1]/div

You will get the following response:

[{"text":"<div class='name'>Dell Latitude D610-1.73 Laptop Wireless Computer</div>","attrs":{"class":"name"}}]
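
The same request is just as easy to issue from code. Here is a minimal Python sketch that calls the service with the parameters from the example above (assuming the service is still online and behaves as described):

    import requests

    params = {
        'url': 'http://testing-ground.extract-web-data.com/blocks',
        'xpath': '//div[@id=case1]/div[1]/span[1]/div',
    }
    response = requests.get('http://webscrapemaster.com/api/', params=params)
    print(response.json())   # a list of {"text": ..., "attrs": ...} objects, as shown above
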
Visual Web Scraper

Also, this service offers a special visual tool for building such requests. All you need to do is enter the URL of the website and click on the element you need to scrape:
Visual Web Scraper
Conclusion

Though I understand that the developer of this service is attempting to create a simple web scraping service, it is still hard to imagine where it can be useful. The task this service performs can easily be accomplished by means of any programming language.

Probably if you already have software receiving JSON from the web, and you want to feed it with data from some website, then you may find this service useful. The other possible application is to hide your IP when you do web scraping. If you have other ideas, it would be great if you shared them with us.



Source: http://extract-web-data.com/a-simple-way-to-turn-a-website-into-json/

Tuesday 24 September 2013

Selenium IDE and Web Scraping

Selenium is a browser automation framework that includes an IDE, a Remote Control server and bindings in various flavors including Java, .Net, Ruby, Python and others. In this post we touch on the basic structure of the framework and its application to web scraping.
What is Selenium IDE


Selenium IDE is an integrated development environment for Selenium scripts. It is implemented as a Firefox plugin, and it allows recording browser interactions so that they can be edited later. This works well for composing and debugging software tests. The Selenium Remote Control is a server specific to a particular environment; it lets custom scripts drive the controlled browsers. Selenium deploys on Windows, Linux, and Mac OS. You can read here how the various Selenium components are supported by the major browsers.
What Selenium does and Web Scraping

Basically, Selenium automates browsers. This ability can no doubt be applied to web scraping. Since browsers (and Selenium) support JavaScript, jQuery and other methods of working with dynamic content, why not use this mix for web scraping, rather than trying to catch Ajax events with plain code? The second reason for this kind of scrape automation is browser-fashion data access (though today this is emulated by most libraries).

Yes, Selenium automates browsers, but how do you control Selenium from a custom script to automate a browser for web scraping? There are Selenium libraries (bindings) for PHP and other languages that let scripts call and use Selenium. It is possible to write Selenium clients (using those libraries) in almost any language we prefer, for example Perl, Python, Java, PHP, etc. Those libraries (the API), together with the Java-written server that invokes browsers for actions, constitute the Selenium RC (Remote Control). The Remote Control automatically loads the Selenium Core into the browser in order to control it. For more details on the Selenium components, refer here.



A tough scrape task for a programmer

"…cURL is good, but it is very basic. I need to handle everything manually; I am creating HTTP requests by hand. This gets difficult – I need to do a lot of work to make sure that the requests that I send are exactly the same as the requests that a browser would send, both for my sake and for the website's sake. (For my sake because I want to get the right data, and for the website's sake because I don't want to cause error messages or other problems on their site because I sent a bad request that messed with their web application.) And if there is any important javascript, I need to imitate it with PHP. It would be a great benefit to me to be able to control a browser like Firefox with my code. It would solve all my problems regarding the emulation of a real browser… it seems that Selenium will allow me to do this…" – Ryan S

Yes, that’s what we will consider below.
Scrape with Selenium

In order to create scripts that interact with the Selenium Server (Selenium RC, Selenium Remote WebDriver) or to create a local Selenium WebDriver script, you need to make use of language-specific client drivers (also called Formatters; they are included in the selenium-ide-1.10.0.xpi package). The Selenium servers, drivers and bindings are available at the Selenium download page.
The basic recipe for scraping with Selenium:

    Use the Chrome or Firefox browser.
    Get Firebug or the Chrome Dev Tools (Ctrl+Shift+I) in action.
    Install the requirements (Remote Control or WebDriver, libraries and so on).
    Selenium IDE: record a 'test' run through the site, adding some assertions.
    Export it as a Python (or other language) script.
    Edit the script (loops, data extraction, db input/output).
    Run the script against the Remote Control.

The short intro Slides for the scraping of tough websites with Python & Selenium are here (as Google Docs slides) and here (Slide Share).
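
For a feel of what the exported and edited script might end up looking like, here is a minimal sketch using the Python WebDriver bindings (an assumption on my part, since the exact exported code depends on the chosen Formatter; the target URL and selector come from the testing-ground example earlier in this blog, and a local Firefox installation is assumed):

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Firefox()          # or webdriver.Chrome()
    try:
        driver.get('http://testing-ground.extract-web-data.com/blocks')
        # loop over the elements of interest and extract their text
        for element in driver.find_elements(By.CSS_SELECTOR, 'div.name'):
            print(element.text)
    finally:
        driver.quit()
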
Selenium components for Firefox installation guide

For how to install the Selenium IDE in Firefox, see here, starting at slide 21. The Selenium Core and Remote Control installation instructions are there too.
Extracting dynamic content using jQuery/JavaScript with Selenium

One programmer is doing a similar thing …

1. Launch a Selenium RC (remote control) server.
2. Load a page.
3. Inject the jQuery script.
4. Select the content of interest using jQuery/JavaScript.
5. Send it back to the PHP client using JSON.

He particularly finds it quite easy and convenient to use jQuery for screen scraping, rather than using PHP/XPath.
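
The same injection trick can be reproduced with the WebDriver bindings used in the sketch above, shown here in Python rather than PHP (the jQuery CDN URL and the selector are illustrative assumptions, and a real script would wait for the injected script properly instead of sleeping):

    import time

    from selenium import webdriver

    driver = webdriver.Firefox()
    try:
        driver.get('http://testing-ground.extract-web-data.com/blocks')
        # 3. inject jQuery if the page does not already ship it
        driver.execute_script("""
            if (typeof jQuery === 'undefined') {
                var s = document.createElement('script');
                s.src = 'https://code.jquery.com/jquery-3.7.1.min.js';
                document.head.appendChild(s);
            }
        """)
        time.sleep(2)   # crude wait for the injected script to load
        # 4./5. select the content of interest with jQuery and return it as a list
        names = driver.execute_script(
            "return jQuery('div.name').map(function () { return jQuery(this).text(); }).get();")
        print(names)
    finally:
        driver.quit()
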
Conclusion

The Selenium IDE is a popular tool for browser automation, mostly because of its software testing application, yet web scraping techniques for tough dynamic websites may also be implemented with the IDE along with the Selenium Remote Control server. These are the basic steps for it:

    Record the 'test' browser behavior in the IDE and export it as a script in the programming language of your choice.
    The exported script runs against the Remote Control server, which drives the browser to send HTTP requests; the script then catches the Ajax-powered responses and extracts the content.

Selenium-based web scraping is an easy task for small-scale projects, but it consumes a lot of memory, since it launches a new browser instance for each request.



Source: http://extract-web-data.com/selenium-ide-and-web-scraping/

Monday 23 September 2013

Has It Been Done Before? Optimize Your Patent Search Using Patent Scraping Technology

Since the US patent office opened in 1790, inventors across the United States have been submitting all sorts of great products and half-baked ideas to its database. Nowadays, many individuals get ideas for great products only to have the patent office do a patent search and tell them that their ideas have already been patented by someone else! Herein lies a question: How do I perform a patent search to find out if my invention has already been patented before I invest time and money into developing it?

The US patent office patent search database is available to anyone with internet access.

US Patent Search Homepage

Performing a patent search with the patent searching tools on the US Patent office webpage can prove to be a very time consuming process. For example, patent searching the database for "dog" and "food" yields 5745 patent search results. The straightforward approach to investigating the patent search results for your particular idea is to go through all 5745 results one at a time looking for yours. Get some munchies and settle in, this could take a while! The patent search database sorts results by patent number instead of relevancy. This means that if your idea was recently patented, you will find it near the top, but if it wasn't, you could be searching for quite a while. Also, most patent search results have images associated with them. Downloading and displaying these images over the internet can be very time consuming depending on your internet connection and the availability of the patent search database servers.

Because patent searches take such a long time, many companies and organizations are looking for ways to improve the process. Some organizations and companies will hire employees for the sole purpose of performing patent searches for them. Others contract out the job to small businesses that specialize in patent searches. The latest technology for performing patent searches is called patent scraping.

Patent scraping is the process of writing computer automated scripts that analyze a website and copy only the content you are interested in into easily accessible databases or spreadsheets on your computer. Because it is a computerized script performing the patent search, you don't need a separate employee to get the data, you can let it run the patent scraping while you perform other important tasks! Patent scraping technology can also extract text content from images. By saving the images and textual content to your computer, you can then very efficiently search them for content and relevancy; thus saving you lots of time that could be better spent actually inventing something!

To put a real-world face on this, let us consider the pharmaceutical industry. Many different companies are competing for the patent on the next big drug. It has become an indispensable tactic of the industry for one company to perform patent searches for what patents the other companies are applying for, thus learning in which direction the research and development team of the other company is heading. Using this information, the company can then choose to either pursue that direction heavily, or spin off in a different direction. It would quickly become very costly to maintain a team of researchers dedicated to only performing patent searches all day. Patent scraping technology is the means for figuring out what ideas and technologies are coming about before they make headline news. It is by utilizing patent scraping technology that the large companies stay up to date on the latest trends in technology.

While some companies choose to hire their own programming team to do their patent scraping scripts for them, it is much more cost effective to contract out the job to a qualified team of programmers dedicated to performing such services.




Source: http://ezinearticles.com/?Has-It-Been-Done-Before?-Optimize-Your-Patent-Search-Using-Patent-Scraping-Technology&id=171000

Online Data Entry Projects - Grab An Online Audience by Data Entry

In the current scenario, it is very tough to grab business from every angle. You need a very large marketing budget as well as a setup to manage the marketing teams. One of the biggest sources of business is the online audience. You must grab their attention to generate more value.

It is not so hard to grab the attention of online customers, but it requires time and people to work on it. Rather than doing this process on your own, I suggest outsourcing the projects as online data entry. That will surely be beneficial for your company. This way, you can get out of a time-consuming and tedious task. Here is a small list of online data entry projects that can help you develop the reputation of your business:

Twitter Status Update:
Twitter is the most famous online community. Some of you may know it as a micro-blogging site. It is used to connect with people and exchange thoughts. People join in but have no idea what to post. To capture more and more attention, you have to update your status consistently. You can outsource this as an online data entry project and get good business without spending time on it. My friend, who manages a small duct cleaning business, said "140 characters make much difference in your profit".

Articles:
Articles are one of the oldest ways of getting new clients and generating business from online sources. This is a platform where you can give more information or knowledge about your services. You can describe the benefits and usefulness of a product. You have to write about 250 to 350 words and submit them to article directories. This is an out-of-focus task for you, so outsourcing it as an online data entry project is the best method. Matthew, who works in an insurance firm, told me "I am getting a lot of customers from articles. Without the help of articles, I am sure I would be kicked out of this job".

Blog Post Entries:
A blog is a personal place where you can share the latest updates and detailed information about your business. Various companies and individuals have blogs but are unable to manage them. Through online data entry, you can get professionals who can easily manage your blogs. My colleague manages a blog and got 2,000 visitors in just six months. So, outsource the online data typing projects and move your business a step ahead of the competition.

Bea Arthur is a quality controller at Data Entry India, a well-known firm accepting data entry projects, data conversion projects and data processing projects. They have more than 17 years of experience in online data entry.



Source: http://ezinearticles.com/?Online-Data-Entry-Projects---Grab-An-Online-Audience-by-Data-Entry&id=4298308

Sunday 22 September 2013

Data Mining And Importance to Achieve Competitive Edge in Business

What is data mining? And why is it so important in business? These are simple yet complicated questions to answer; below is brief information to help in understanding data and web mining services.

Mining of data, in general terms, can be described as retrieving useful information or knowledge, analyzing it from various perspectives and summarizing it into valuable information that can be used to increase revenue, cut costs, or gather competitive information on a business or product. Data abstraction is of great importance in the business world, as it helps businesses harness the power of accurate information, thus providing a competitive edge. Many business firms and companies have their own warehouses to help them collect, organize and mine information such as transactional data, purchase data, etc.

But having mining services and a warehouse on premises is neither affordable nor a very cost-effective route to reliable information, even though extracting information is a necessity for every business nowadays. Many companies therefore provide accurate and effective data and web data mining solutions at a reasonable price.

Outsourced information abstraction services are offered at affordable rates and are available for a wide range of data mining solutions:

• Taking out business data
• Services to gather data sets
• Digging information out of datasets
• Website data mining
• Stock market information
• Statistical information
• Information classification
• Information regression
• Structured data analysis
• Online mining of data to gather product details
• Gathering prices
• Gathering product specifications
• Gathering images

Outsourced web mining and data gathering solutions have been effective in cutting costs and increasing productivity at affordable rates. Benefits of data mining services include:

• Clear customer, service or product understanding
• Less or minimal marketing cost
• Exact information on sales and transactions
• Detection of beneficial patterns
• Minimized risk and increased ROI
• New market detection
• Clear understanding of business problems and goals

Accurate data mining solutions can prove to be an effective way to cut costs by concentrating effort in the right place.



Source: http://ezinearticles.com/?Data-Mining-And-Importance-to-Achieve-Competitive-Edge-in-Business&id=5771888

Friday 20 September 2013

Data Mining Explained

Overview
Data mining is the crucial process of extracting implicit and possibly useful information from data. It uses analytical and visualization techniques to explore and present information in a format which is easily understandable by humans.

Data mining is widely used in a variety of profiling practices, such as fraud detection, marketing research, surveys and scientific discovery.

In this article I will briefly explain some of the fundamentals and its applications in the real world.

Herein I will not discuss related processes of any sort, including Data Extraction and Data Structuring.

The Effort
Data Mining has found its application in various fields such as financial institutions, health-care & bio-informatics, business intelligence, social networks data research and many more.

Businesses use it to understand consumer behavior, analyze buying patterns of clients and expand their marketing efforts. Banks and financial institutions use it to detect credit card fraud by recognizing the patterns involved in fake transactions.

The Knack
There is definitely a knack to data mining, as there is with any other field of web research activity. That is why it is referred to as a craft rather than a science. A craft is the skilled practicing of an occupation.

One point I would like to make here is that data mining solutions offer an analytical perspective on the performance of a company based on historical data, but one needs to consider unknown external events and deceitful activities as well. On the flip side, it is even more critical, especially for regulatory bodies, to forecast such activities in advance and take the necessary measures to prevent such events in the future.

In Closing
There are many important niches of Web Data Research that this article has not covered. But I hope that this article will provide you a stage to drill down further into this subject, if you want to do so!

Should you have any queries, please feel free to mail me. I would be pleased to answer each of your queries in detail.




Source: http://ezinearticles.com/?Data-Mining-Explained&id=4341782

Thursday 19 September 2013

Data Mining, Not Just a Method But a Technique

Web data mining is the process of segregating probable clients out of the huge amount of information available on the Internet by performing various searches. The data could be well organized and structured, or raw, depending on its use. Web data mining can be done using a simple database program or by investing money in a costly one.

Start collecting basic contact information of probable clients, such as: names, addresses, landline and cell phone numbers, email addresses and education or occupation if required.

CART and CHAID data mining

While collecting data you will come across tree-shaped structures that represent decisions. These derived decisions give rules for the classification of the collected data. Well-known decision tree methods include Classification and Regression Trees, also known as CART, and Chi-Square Automatic Interaction Detection, also known as CHAID. CART and CHAID are decision tree techniques used for classifying collected data. They provide a set of rules that can be applied to new, unclassified data to predict its class. CART segments a dataset by creating two-way splits, whereas CHAID segments it using chi-square tests, creating multi-way splits. CART typically requires less data preparation than CHAID.
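
As a minimal illustration of the CART side of this, scikit-learn's DecisionTreeClassifier builds a CART-style tree with two-way splits; the feature names and values below are invented purely to show the derived rules:

    from sklearn.tree import DecisionTreeClassifier, export_text

    # toy customer records: [age, owns_home]; label: responded to an offer or not
    X = [[25, 0], [40, 1], [35, 1], [50, 0], [23, 0], [44, 1]]
    y = [0, 1, 1, 0, 0, 1]

    tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
    print(export_text(tree, feature_names=['age', 'owns_home']))   # the classification rules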

Understanding customer's actions

Keep track of the customer's actions: what does he buy, when does he buy, why does he buy, what does he use his purchase for, etc. Knowing such simple things about your customer will help you understand his needs better, and thus the data mining process will be easier and quality data will be mined. This will strengthen your personal relations with your customer, which will finally result in a better professional relationship.

Following demography

Mine the data by demography, depending on the geography as well as the socio-economic background of the business location. You can use government statistics as the source of your data collection. With that in mind, you can go ahead with understanding the existing community and thus the data required.

Use your informal conversation in serving your clients better

Use the minute details of your conversations and understanding with your customers to serve them. If necessary, conduct surveys, send a professional gift, or use some other gesture that helps you better understand and fulfil customer needs. This will increase the bond between you and your customer, and you will be able to serve your customer better when providing data mining services.

Insert the collected information into a desktop database. As more information is collected, you will find that you can prepare specific templates for feeding it in. Using a desktop database, it is easier to make changes later on, as and when required.




Source: http://ezinearticles.com/?Data-Mining,-Not-Just-a-Method-But-a-Technique&id=5416129

Wednesday 18 September 2013

Time Saving and Money Saving Data Entry Services

If you run an organization, then data entry is definitely a task you have to deal with. The main concerns for any organization that hires data entry services are flexibility and value for money. People need services which provide fast, accurate entry of any form of hand-written data.

Data entry is very straightforward work but requires enough manpower. As a result, many companies prefer to outsource data entry services to offshore countries. A company just has to find a reliable data entry partner in an offshore country which provides accurate data entry services at the most affordable prices.

As competition grows, many data entry firms from offshore countries offer the most competitive prices for data entry services. Outsourcing is not a new concept, and there is a vast market doing outsourcing work. If you are looking to outsource data entry work, then India is the best outsourcing destination.

Many firms in India have enough experience with data entry projects to give the best possible data entry solutions using advanced data entry tools. Every day, a number of companies want to move their paper documents into electronic format. These offshore firms provide data entry services from qualified and well-trained data entry professionals. Their experienced and professional data entry teams are highly trained in handling and capturing large quantities of data in the minimum time possible. Outsourcing data entry and document processing work will save you valuable time and money. Utilizing this time and money, you will be able to concentrate on the more important parts of your business, which leads to higher profits in the best possible time.

Effective policies lead a business to continued progress and help it survive in today's highly competitive market. Since, in many cases, non-core activities create headaches on the path of progress, it is also essential to finish them accurately, as they provide assistance to the core business.

So choosing to outsource less important data entry work as a business strategy allows you to pay more attention to your core business activities.

The author is associated with the offshore data entry services firm ServicesDataEntry.co.uk. The firm handles outsourced data entry services such as online data entry services and many more.




Source: http://ezinearticles.com/?Time-Saving-and-Money-Saving-Data-Entry-Services&id=2908114

Tuesday 17 September 2013

What's Your Excuse For Not Using Data Mining?

In an earlier article I briefly described how data mining and RFM analysis can help marketers be more efficient (read... increased marketing ROI!). These marketing analytics tools can significantly help with all direct marketing efforts (multichannel campaign management efforts using direct mail, email and call center) and some interactive marketing efforts as well. So, why aren't all companies using it today? Well, typically it comes down to a lack of data and/or statistical expertise. Even if you don't have data mining expertise, YOU can benefit from data mining by using a consultant. With that in mind, let's tackle the first problem -- collecting and developing the data that is useful for data mining.

The most important data to collect for data mining include:

o Transaction data - For every sale, you at least need to know the product and the amount and date of the purchase.

o Past campaign response data - For every campaign you've run, you need to identify who responded and who didn't. You may need to use direct and indirect response attribution.

o Geo-demographic data - This is optional, but you may want to append your customer file/database with consumer overlay data from companies like Acxiom.

o Lifestyle data - This is also an optional append of indicators of socio-economic lifestyle that are developed by companies like Claritas.

All of the above data may or may not exist in the same data source. Some companies have a single holistic view of the customer in a database and some don't. If you don't, you'll have to make sure all data sources that contain customer data have the same customer ID/key. That way, all of the needed data can be brought together for data mining.

How much data do you need for data mining? You'll hear many different answers, but I like to have at least 15,000 customer records to have confidence in my results.

Once you have the data, you need to massage it to get it ready to be "baked" by your data mining application. Some data mining applications will automatically do this for you. It's like a bread machine where you put in all the ingredients -- they automatically get mixed, the bread rises, bakes, and is ready for consumption! Some notable companies that do this include KXEN, SAS, and SPSS. Even if you take the automated approach, it's helpful to understand what kinds of things are done to the data prior to model building.

Preparation includes:

o Missing data analysis. What fields have missing values? Should you fill in the missing values? If so, what values do you use? Should the field be used at all?

o Outlier detection. Is "33 children in a household" extreme? Probably - and consequently this value should be adjusted to perhaps the average or maximum number of children in your customer's households.

o Transformations and standardizations. When various fields have vastly different ranges (e.g., number of children per household and income), it's often helpful to standardize or normalize your data to get better results. It's also useful to transform data to get better predictive relationships. For instance, it's common to transform monetary variables by using their natural logs.

o Binning data. Binning continuous variables is an approach that can help with noisy data. It is also required by some data mining algorithms (a toy sketch of these preparation steps follows below).
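
To make these preparation steps concrete, here is a toy pandas sketch covering outlier capping, a log transform and binning (the column names and values are invented; the automated tools named above do the equivalent for you):

    import numpy as np
    import pandas as pd

    # hypothetical customer table
    df = pd.DataFrame({
        'income':       [35000, 52000, 120000, 41000, 67000],
        'num_children': [2, 0, 33, 1, 3],     # 33 is an obvious outlier
    })

    # outlier handling: cap the number of children at a sane maximum
    df['num_children'] = df['num_children'].clip(upper=5)

    # transformation: natural log of a monetary variable
    df['log_income'] = np.log(df['income'])

    # binning: split income into four equal-frequency buckets
    df['income_bin'] = pd.qcut(df['income'], q=4, labels=False)

    print(df)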

More to come on data mining for marketers in my next article.

Jim Stafford has worked for leading companies in the Marketing Automation space (BI, data mining, campaign management and eMarketing) for over 10 years. He has held roles of Director - Database Marketing Solutions, Pre-Sales Manager, Product Manager, and Solution Architect at companies like Aprimo, Group1 Software, SAS, Siebel, SPSS and Unica. Mr. Stafford has consistently helped sales teams meet or beat established sales targets. He was the principal pre-sales contributor to Siebel's second largest MA sale with General Motors. Jim has had considerable exposure to many verticals including: Financial Services, Hospitality & Entertainment, Automotive, Communications, and Utilities. He is a seasoned expert at discovery and knows key industry trends. Jim has an M.A. Degree in Economics from the University of Maryland and has been a frequent speaker at annual National Center for Database Marketing and Direct Marketing Associations events. Visit [http://www.staffordsbsg.com/] to learn more about Jim and his company's services.




Source: http://ezinearticles.com/?Whats-Your-Excuse-For-Not-Using-Data-Mining?&id=3576029

Monday 16 September 2013

Choosing the Right Web Design Company

When you choose to create an online presence for your organization, you have to make crucial decisions for making your business dreams come true. Website design is an essential component for your online business presence. Therefore, it is vital that you choose the right Web Design firm to design your company's website.

If you reside somewhere in the United Arab Emirates, you have a multitude of web design agencies to choose from. You need a web design firm that can work under the budget you offer and provide you with a quality web design that meets your business objectives. However, it's easier said than done. There are some steps that you need to take before selecting the right web design company to design or re-design your company's website.

1. Narrow down your choices

There are hundreds of design firms in Dubai, Sharjah, Abu Dhabi and all the other states around the UAE. You can't possibly check out every company's profile; therefore, you need to narrow down your list of choices. You can choose any criteria to narrow down your options; however, it is best that you scrape out all the companies that have not worked for any clients in your business domain. For instance, if you own a real estate agency and you want a firm to design your company's website, then you should narrow down your selection by listing all the web design firms that have already designed websites for other real estate agencies.

2. Read Client portfolios and Testimonials

Now that you have a limited number of options to choose from, you need to check out their websites and read the testimonials their clients have written for them. Moreover, you can read their portfolios and find out about the type of work they have done in the past. With this research, you can judge the abilities and capabilities of all the web design companies on your list and rate them accordingly.

3. Are they adept at the latest trends and technologies?

After you have rated all the companies on your list, you need to find out whether these companies are well versed and up to date with modern-day web technologies. You can either have a direct conversation, to judge their potential, or you can read about their expertise online.

4. Shortlist companies that meet your budget

Don't let your limited budget get in the way of your dreams. Many web design companies in UAE charge unrealistically. However, if you can get compelling designs for a reasonable price that meet your budget then that's the best web design company for you. Try to negotiate the prices and most probably, it will work for you.

5. Look for a long-term relationship

The most important factor that you should consider is to look for a company that is a one-stop shop for all your present and future needs. You may need to get your website developed and promoted afterwards. Therefore, it is recommended to have one firm do it all for you rather than hiring different firms for different services.

The time you spend taking these few steps, before selecting the right web design company, may save you from making costly mistakes at a later stage. Your website's design is the spearhead of all your marketing tools and, therefore, it is of paramount importance in order to attract, engage and retain visitors while winning sales for your business.

The author has written a number of online business development articles. She has been working for a highly reputed Software Solution Provider, Elevation New Media, for years now and is an expert in providing useful tips to optimize websites.



Source: http://ezinearticles.com/?Choosing-the-Right-Web-Design-Company&id=2633958

Sunday 15 September 2013

Data Mining Services

You can get all kinds of data mining solutions from many companies in India. You can consult a variety of companies for data mining services, and considering that variety is beneficial to customers. These companies also offer web research services which help businesses perform critical activities.

Very competitive prices are the result where there is competition among qualified players in data mining, data collection and other computer-based services. Every company willing to cut its costs for outsourced data mining services and BPO data mining services will benefit from the companies offering data mining services in India. In addition, web research services are being sourced from these companies.

Outsourcing is a great way to reduce labor costs, and companies in India benefit from clients inside the country as well as from outside it. The most famous aspect of outsourcing is data entry. Preferring outsourced services from offshore countries has long been a practice companies use to reduce costs, and therefore it is no wonder that data mining gets outsourced to India.

For companies which are seeking outsourcing services, such as outsourced web data extraction, it is good to consider a variety of providers. The comparison will help them get the best quality of service, and businesses will grow rapidly thanks to the opportunities provided by the outsourcing companies. Outsourcing not only provides opportunities for companies to reduce costs but also supplies labor where countries are experiencing a shortage.

Outsourcing presents a good and fast communication opportunity to companies. People communicate at the most convenient time they have to get the job done. The company is able to gather dedicated resources and a team to accomplish its purpose. Outsourcing is a good way of getting a job done well because the company will look for the best workforce. In addition, the competition among outsourcing providers provides rich ground for getting the best ones.

In order to retain the job, providers need to perform very well. The company gets high-quality services relative to the price it is paying. In fact, it is easy to get people to work on your projects, and companies are able to get work done in the shortest time possible. For instance, where there is a lot of work to be done, companies may post the projects onto websites and the projects will quickly find people to work on them. The time factor matters when the company does not want to wait and needs the projects completed immediately.

Outsourcing has been effective in cutting labor costs because companies do not have to pay the extra amounts required to retain employees, such as travel allowances, housing and health benefits. Those responsibilities are met by the companies that employ people on a permanent basis. The opportunity presented by the outsourcing of data and services is, among many other things, comfort, because these jobs can be completed at home. This is the reason why such jobs will be preferred even more in the future.

To increase business effectiveness, productivity and workflow, you need a quality, accurate data entry system. This unrivaled quality is provided by data extraction services with an excellent track record of providing quality services.




Source: http://ezinearticles.com/?Data-Mining-Services&id=4733707

Friday 13 September 2013

Beneficial Data Collection Services

The Internet is becoming the biggest source for information gathering. A variety of search engines are available over the World Wide Web which help in finding any kind of information easily and quickly. Every business needs relevant data for its decision making, in which market research plays a crucial role. One of the services booming very fast is data collection. This data mining service helps in gathering the relevant data that is so badly needed for your business or personal use.

Traditionally, data collection has been done manually, which is not very feasible when bulk data is required. People still copy and paste data from web pages manually, or download a complete website, which is a sheer waste of time and effort. Instead, a more reliable and convenient method is the automated data collection technique: web scraping techniques that crawl through thousands of web pages for the specified topic and simultaneously incorporate this information into a database, XML file, CSV file, or other custom format for future reference. A few of the most common web data extraction uses are: crawling websites which provide information about a competitor's pricing and featured data; spidering a government portal to extract the names of citizens for an investigation; and collecting from websites which have a variety of downloadable images.

Besides this, there is a more sophisticated kind of automated data collection service. Here, you can easily scrape website information on a daily basis automatically. This method greatly helps you in discovering the latest market trends, customer behavior and future trends. A few of the major examples of automated data collection solutions are: monitoring price information; collecting data from various financial institutions on a daily basis; and verifying different reports on a constant basis and using them to make better, more progressive business decisions.

While using these services, make sure you follow the right procedure. For example, when you are retrieving data, download it into a spreadsheet so that the analysts can do the comparison and analysis properly. This will also help in getting accurate results in a faster and more refined manner.




Source: http://ezinearticles.com/?Beneficial-Data-Collection-Services&id=5879822

Thursday 12 September 2013

Why Outsourcing Data Mining Services?

Are huge volumes of raw data waiting to be converted into information that you can use? Your organization's hunt for valuable information ends with valuable data mining, which can help bring more accuracy and clarity to the decision-making process.

Nowadays the world is information hungry, and with the Internet offering flexible communication, there is a remarkable flow of data. It is important to make the data available in a readily workable format where it can be of great help to your business. Such filtered data is of considerable use to the organization, and these services are used efficiently to increase profits, smooth the workflow and reduce overall risks.

Data mining is a process that involves sorting through vast amounts of data and seeking out the pertinent information. In most instances data mining is conducted by professionals, business organizations and financial analysts, although there are many growing fields that are discovering its benefits for their business.

Data mining helps make every decision quick and feasible. The information obtained through it is used in several decision-making applications relating to direct marketing, e-commerce, customer relationship management, healthcare, scientific tests, telecommunications, financial services and utilities.

Data mining services include:

    Congregating data from websites into an Excel database
    Searching & collecting contact information from websites
    Using software to extract data from websites
    Extracting and summarizing stories from news sources
    Gathering information about competitors' businesses

In this era of globalization, handling important data is becoming a headache for many business verticals, and outsourcing is a profitable option for your business. Since all projects are customized to suit the exact needs of the customer, huge savings in terms of time, money and infrastructure can be realized.

Advantages of Outsourcing Data Mining Services:

    Skilled and qualified technical staff who are proficient in English
    Improved technology scalability
    Advanced infrastructure resources
    Quick turnaround time
    Cost-effective prices
    Secure Network systems to ensure data safety
    Increased market coverage

Outsourcing will help you focus on your core business operations and thus improve overall productivity. Data mining outsourcing has therefore become a wise choice for business. Outsourcing these services helps businesses manage their data effectively, which in turn enables them to achieve higher profits.




Source: http://ezinearticles.com/?Why-Outsourcing-Data-Mining-Services?&id=3066061

Wednesday 11 September 2013

Web Data Extraction Services and Data Collection from Website Pages

For any business, market research and surveys play a crucial role in strategic decision making. Web scraping and data extraction techniques help you find relevant information and data for your business or personal use. Most of the time professionals manually copy and paste data from web pages or download a whole website, resulting in a waste of time and effort.

Instead, consider using web scraping techniques that crawls through thousands of website pages to extract specific information and simultaneously save this information into a database, CSV file, XML file or any other custom format for future reference.

Examples of web data extraction process include:
• Spider a government portal, extracting names of citizens for a survey
• Crawl competitor websites for product pricing and feature data
• Use web scraping to download images from a stock photography site for website design

Automated Data Collection
Web scraping also allows you to monitor website data changes over a stipulated period and collect this data on a scheduled basis automatically. Automated data collection helps you discover market trends, determine user behavior and predict how the data will change in the near future.

Examples of automated data collection include:
• Monitor price information for selected stocks on an hourly basis
• Collect mortgage rates from various financial firms on a daily basis
• Check various reports on a constant basis, as and when required

Using web data extraction services you can mine any data related to your business objective and download it into a spreadsheet so that it can be analyzed and compared with ease.

In this way you get accurate and quicker results saving hundreds of man-hours and money!

With web data extraction services you can easily fetch product pricing information, sales leads, mailing databases, competitors' data, profile data and much more on a consistent basis.



Source: http://ezinearticles.com/?Web-Data-Extraction-Services-and-Data-Collection-Form-Website-Pages&id=4860417

Monday 9 September 2013

Collecting Data With Web Scrapers

There is a large amount of data available only through websites. However, as many people have found out, trying to copy data into a usable database or spreadsheet directly out of a website can be a tiring process. Data entry from internet sources can quickly become cost prohibitive as the required hours add up. Clearly, an automated method for collating information from HTML-based sites can offer huge management cost savings.

Web scrapers are programs that are able to aggregate information from the internet. They are capable of navigating the web, assessing the contents of a site, and then pulling data points and placing them into a structured, working database or spreadsheet. Many companies and services use web scraping programs for tasks such as comparing prices, performing online research, or tracking changes to online content.

Let's take a look at how web scrapers can aid data collection and management for a variety of purposes.

Improving On Manual Entry Methods

Using a computer's copy and paste function or simply typing text from a site is extremely inefficient and costly. Web scrapers are able to navigate through a series of websites, make decisions on what is important data, and then copy the info into a structured database, spreadsheet, or other program. Software packages include the ability to record macros by having a user perform a routine once and then have the computer remember and automate those actions. Every user can effectively act as their own programmer to expand the capabilities to process websites. These applications can also interface with databases in order to automatically manage information as it is pulled from a website.

Aggregating Information

There are a number of instances where material stored in websites can be manipulated and stored. For example, a clothing company that is looking to bring their line of apparel to retailers can go online for the contact information of retailers in their area and then present that information to sales personnel to generate leads. Many businesses can perform market research on prices and product availability by analyzing online catalogues.

Data Management

Managing figures and numbers is best done through spreadsheets and databases; however, information on a website formatted with HTML is not readily accessible for such purposes. While websites are excellent for displaying facts and figures, they fall short when they need to be analyzed, sorted, or otherwise manipulated. Ultimately, web scrapers are able to take the output that is intended for display to a person and change it to numbers that can be used by a computer. Furthermore, by automating this process with software applications and macros, entry costs are severely reduced.

This type of data management is also effective at merging different information sources. If a company were to purchase research or statistical information, it could be scraped in order to format the information into a database. This is also highly effective at taking a legacy system's contents and incorporating them into today's systems.

Overall, a web scraper is a cost effective user tool for data manipulation and management.



Source: http://ezinearticles.com/?Collecting-Data-With-Web-Scrapers&id=4223877

Sunday 8 September 2013

Digitize Data With Data Processing Services

Unorganized data might cost you your numero uno position in your domain. Well-organized data is not only helpful in decision-making but also guarantees a smooth flow of your business. If you are stuck with heaps of documents to be converted into electronic format, then outsourcing your files to a company providing large volume data processing services is the most accurate and efficient option.

Data processing is the process in which computer programs and other processing systems are used to analyze, summarize and convert the data into an electronic format.

It involves a series of processes, which are: -

    Validation - This process checks whether the entries are correct or not.
    Sorting - In this process, sorting is done either sequentially or in various sets.
    Summarize data - This process summarizes the data into main points.
    Aggregation - Combination of different fragments of records takes place in this process.
    Analysis - This process involves the analysis, interpretation and presentation of the collected and organized data.

Data processing companies have comprehensive knowledge about all the above mentioned steps and will provide a complete package of Large volume data processing services which includes: -

    Manual data entry
    Forms based data capture
    Full text data capture
    Digitization
    Document conversion
    Word Processing
    e-Book conversion
    Data extraction from web
    OCR- Optical character recognition

By outsourcing, you can get rid of large volumes of data pretty quickly and can lay more stress on core business activities.

You will have access to many other benefits like: -

    Heaps of cluttered and unorganized work will be organized, sorted and digitized.
    You can make use of neatly organized data to make informed business decisions.
    Chances of losing data will be scarce once it is digitized.
    You can do away with unwanted data and get access to relevant data.
    You can cut down the operating costs and need not incur any expenses in setting up infrastructure.
    You can get the data converted into a form of your choice.

Companies that deal with Large volume data processing services have the experience, expertise, manpower and technology to deliver results as per your expectations. They can handle your bulk of data easily and process it in your desired format within the deadline.

If you want your large volume of data to be digitized with accuracy and at cost-effective rates, choose an outsourcing company which has years of experience in providing large volume data processing services. You just need to spend a few hours browsing the net and then short-listing the prospects. Once you are done going through the portfolios of these firms and are content with their information, you can negotiate the rate with them and stipulate the timeline.

This article about large volume data Processing services has been authored by Sam Efron. He is an experienced technical content writer from data-entry-india.com. With several years of experience and expertise of writing about Data Processing Services, he brings a seasoned maturity and knowledge to his articles.



Source: http://ezinearticles.com/?Digitize-Data-With-Data-Processing-Services&id=7963690

Friday 6 September 2013

Data Mining - Critical for Businesses to Tap the Unexplored Market

Knowledge discovery in databases (KDD) is an emerging field and is increasingly gaining importance in today's business. The knowledge discovery process, however, is vast, involving understanding of the business and its requirements, data selection, processing, mining and evaluation or interpretation; it does not have any pre-defined set of rules to go about solving a problem. Among the other stages, the data mining process holds high importance as the task involves identification of new patterns that have not been detected earlier from the dataset. This is relatively a broad concept involving web mining, text mining, online mining etc.

What data mining is, and what it is not

Data mining is the process of extracting information (which has been collected, analyzed and prepared) from the dataset and identifying new patterns in that information. At this juncture, it is also important to understand what it is not. The concept is often confused with knowledge gathering, processing, analysis and interpretation/inference derivation. While these processes are not data mining themselves, they are very much necessary for its successful implementation.

The 'First-mover Advantage'

One of the major goals of the data mining process is to identify an unknown or rather unexplored segment that had always existed in the business or industry, but was overlooked. The process, when done meticulously using appropriate techniques, could even make way for niche segments providing companies the first-mover advantage. In any industry, the first-mover would bag the maximum benefits and exploit resources besides setting standards for other players to follow. The whole process is thus considered to be a worthy approach to identify unknown segments.

Online knowledge collection and research involves many complications, and therefore outsourcing data mining services often proves viable for large companies that cannot devote time to the task. Outsourcing web mining or text mining services saves an organization's productive time, which would otherwise be spent on research.

The data mining algorithms and challenges

Every data mining task follows certain algorithms using statistical methods, cluster analysis or decision tree techniques. However, there is no single universally accepted technique that can be adopted for all. Rather, the process completely depends on the nature of the business, industry and its requirements. Thus, appropriate methods have to be chosen depending upon the business operations.

The whole process is a subset of the knowledge discovery process and as such involves different challenges. Analysis and preparation of the dataset is crucial, as well-researched material helps in extracting only the relevant yet unidentified information useful to the business. Hence, the analysis of the gathered material and the preparation of the dataset, which also has to take industry standards into account, consume more time and labor. Investment is another major challenge in the process, as it involves a huge cost for deploying professionals with adequate domain knowledge plus knowledge of statistical and technological aspects.

The importance of maintaining a comprehensive database prompted the need for data mining which, in turn, paved the way for niche concepts. Though the concept has been present for years now, companies faced with ever-growing competition have realized its importance only in recent years. Besides being relevant, the dataset from which the information is actually extracted also has to be large enough to pull out and identify a new dimension. Yet, a standardized approach would result in better understanding and implementation of the newly identified patterns.



Source: http://ezinearticles.com/?Data-Mining---Critical-for-Businesses-to-Tap-the-Unexplored-Market&id=6745886

Thursday 5 September 2013

Web Data Extraction

The Internet as we know it today is a repository of information that can be accessed across geographical boundaries. In just over two decades, the Web has moved from a university curiosity to a fundamental research, marketing and communications vehicle that impinges upon the everyday life of most people all over the world. It is accessed by over 16% of the world's population, spanning over 233 countries.

As the amount of information on the Web grows, that information becomes ever harder to keep track of and use. Compounding the matter, this information is spread over billions of Web pages, each with its own independent structure and format. So how do you find the information you're looking for in a useful format - and do it quickly and easily without breaking the bank?

Search Isn't Enough

Search engines are a big help, but they can do only part of the work, and they are hard-pressed to keep up with daily changes. For all the power of Google and its kin, all that search engines can do is locate information and point to it. They go only two or three levels deep into a Web site to find information and then return URLs. Search engines cannot retrieve information from the deep web (information that is available only after filling in some sort of registration form and logging in) and store it in a desirable format. In order to save the information in a desirable format or feed it into a particular application, after using the search engine to locate data, you still have to do the following tasks to capture the information you need:

· Scan the content until you find the information.

· Mark the information (usually by highlighting with a mouse).

· Switch to another application (such as a spreadsheet, database or word processor).

· Paste the information into that application.

It's not all copy and paste

Consider the scenario of a company looking to build an email marketing list of over 100,000 names and email addresses from a public group. Even if a person manages to copy and paste a name and email in one second, it will take over 28 man-hours (100,000 records x 1 second is roughly 27.8 hours), translating to over $500 in wages alone, not to mention the other costs associated with it. The time involved in copying a record is directly proportional to the number of data fields that have to be copied and pasted.

Is there any Alternative to copy-paste?

A better solution, especially for companies that are aiming to exploit a broad swath of data about markets or competitors available on the Internet, lies with the use of custom Web harvesting software and tools.

Web harvesting software automatically extracts information from the Web and picks up where search engines leave off, doing the work the search engine can't. Extraction tools automate the reading, copying and pasting necessary to collect information for further use. The software mimics human interaction with the website and gathers data as if the website were being browsed. Web harvesting software navigates the website to locate, filter and copy the required data at much higher speeds than is humanly possible. Advanced software is even able to browse the website and gather data silently without leaving footprints of access.
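
To make the idea concrete, here is a minimal Python 3 sketch of the kind of work such a tool automates; the URL and the e-mail pattern are placeholders, and real harvesting tools add navigation, logins, filtering and output formatting on top of this:

# Minimal sketch of automated extraction; the URL is a hypothetical placeholder.
import re
import urllib.request

url = "http://example.com/contacts"
html = urllib.request.urlopen(url).read().decode("utf-8", "ignore")

# Locate and copy the required data (here: e-mail addresses) automatically,
# instead of scanning, highlighting and pasting it by hand.
emails = sorted(set(re.findall(r"[\w.+-]+@[\w-]+\.[\w.-]+", html)))

for address in emails:
    print(address)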

The next article in this series will give more details about how such software works and uncover some myths about web harvesting.



Source: http://ezinearticles.com/?Web-Data-Extraction&id=575212

Wednesday 4 September 2013

Data Mining Process - Why Outsource Data Mining Service?

Overview of Data Mining and Process:
Data mining is one of the unique techniques for investigating information in order to extract certain data patterns and determine outcomes from existing requirements. Data mining is widely used in client research, services analysis, market research and so on. It is based on mathematical algorithms and analytical skills to derive the desired results from huge database collections.

Information mining is mostly used by financial analysts and business and professional organizations; there are also many growing areas of business, from small to large, that get maximum advantage from data extraction through the use of data warehouses.

Most of the functionalities used in the information collecting process are listed below; the short sketch after this list shows how they fit together:

* Retrieving Data

* Analyzing Data

* Extracting Data

* Transforming Data

* Loading Data

* Managing Databases
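
As a rough sketch of how these functionalities fit together, the toy Python pipeline below retrieves records from a CSV file, transforms them and loads them into a small database; the file name, its columns and the database are hypothetical stand-ins:

# Toy retrieve/extract -> transform -> load sketch; "sales.csv", its columns
# and "warehouse.db" are hypothetical.
import csv
import sqlite3

# Retrieving / extracting data from a source file
with open("sales.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Transforming data: normalize text and convert types
cleaned = [(r["region"].strip().title(), float(r["amount"])) for r in rows]

# Loading data into a managed database for later analysis
conn = sqlite3.connect("warehouse.db")
conn.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", cleaned)
conn.commit()
conn.close()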

Small, medium and large businesses alike collect huge amounts of data or information for analysis and research to develop their business. Such large volumes become all the more important whenever information or data is required.

Why Outsource Data Online Mining Service?

Outsourcing advantages of data mining services:
o Save almost 60% in operating costs
o High quality analysis processes ensuring accuracy levels of almost 99.98%
o A guaranteed risk-free outsourcing experience ensured by strict information security policies and practices
o Get your project done within a quick turnaround time
o You can gauge the provider's skills and expertise by taking advantage of a free trial program
o Get the gathered information presented in a simple and easy-to-access format

Thus, data or information mining is a very important part of web research services and a most useful process. By outsourcing data extraction and mining services, you can concentrate on your core business and grow as fast as you desire.

Outsourcing Web Research is a trusted and well-known Internet market research organization with years of experience in the BPO (business process outsourcing) field.

If you want more information about data mining services and related web research services, then contact us.



Source: http://ezinearticles.com/?Data-Mining-Process---Why-Outsource-Data-Mining-Service?&id=3789102

Some of the Main Techniques For Data Mining

Data mining is the process of extracting relationships from large data sets. It is an area of Computer Science that has received significant commercial interest. In this article I will detail a few of the most common methods of data mining analysis.

Association rule discovery: Association rule discovery methods are used to extract associations from data sets. Traditionally, the technique was developed on supermarket purchase data. An association rule is a rule of the form X -> Y. An example of this may be "If a customer purchases milk this implies (->) that the customer will also purchase bread". An association rule has associated with it a support and a confidence value. The support is the percentage of all entries (or transactions in this case) that have all the items. For example, the percentage of all transactions in which milk and bread were purchased. The confidence is the percentage of the transactions that satisfy the left hand side of the rule that also satisfy the right hand side of the rule. For example, in this case, the confidence would be the percentage of purchases that purchased milk which also purchased bread. Association discovery methods will extract all possible association rules from a data set for which the user has specified a minimum support and confidence.
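
As a quick, hypothetical illustration of support and confidence, the two measures for the rule milk -> bread can be computed directly over a made-up list of transactions:

# Support and confidence for the rule "milk -> bread" over made-up transactions.
transactions = [
    {"milk", "bread", "eggs"},
    {"milk", "bread"},
    {"milk", "butter"},
    {"bread", "butter"},
]

both = sum(1 for t in transactions if {"milk", "bread"} <= t)   # transactions containing both items
milk = sum(1 for t in transactions if "milk" in t)              # transactions containing milk

support = both / float(len(transactions))   # fraction of all transactions with milk and bread
confidence = both / float(milk)             # fraction of milk transactions that also have bread

print("support = %.2f, confidence = %.2f" % (support, confidence))   # support = 0.50, confidence = 0.67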

Cluster Analysis: Cluster analysis is the process of taking one or more numerical fields and assigning their values to clusters. These clusters represent groups of points which are close to each other. For example, if you watch a documentary on space, you will see that galaxies contain a lot of stars and planets. There are many galaxies in space, but the stars and planets all occur in clusters that are the galaxies. That is, the stars and planets are not randomly located in space but are clumped together in groups that are galaxies. A cluster analysis method is used to find these sorts of groups. If a cluster analysis method were applied to the stars in space, it might find that each galaxy is a cluster and assign a unique cluster identification to each star in a given galaxy. This cluster identification then becomes another field in the data set and can be used in further data mining analysis. For example, you might use a cluster id field to form association rules with other fields in the data set.
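
As a hedged sketch of the idea, the snippet below runs scikit-learn's KMeans (assuming scikit-learn is available) on a few made-up two-dimensional points; each point receives a cluster id, which then becomes a new field in the data set:

# Cluster analysis sketch: two tight groups of made-up points, two clusters.
from sklearn.cluster import KMeans

points = [[1.0, 1.1], [0.9, 1.0], [1.2, 0.8],    # one "galaxy" of points
          [8.0, 8.2], [7.9, 8.1], [8.3, 7.8]]    # another group, far away

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(points)
print(labels)   # e.g. [0 0 0 1 1 1] - one cluster id per point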

Decision Trees: Decision trees are used to form a tree of decisions in a data set to help predict a value. For example, if you were looking at a data set used to predict whether a potential loan applicant would be a credit risk, a tree of decisions would be formed based on factors in the data set. The tree may contain decisions such as whether the applicant had defaulted on a loan before, the age of the applicant, whether the applicant was employed or not, the applicant's income and the total repayments on the loan. You could then follow this tree of decisions to say, for example, that if an applicant has never defaulted on a loan before, the applicant is employed, their income is in the top 15 percentile for the country and the loan amount is relatively low, then there is a very low risk of default.
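
A minimal sketch of the loan example with scikit-learn's DecisionTreeClassifier follows; the features, records and labels are invented for illustration:

# Decision tree sketch on an invented loan data set.
# Features: [defaulted_before, employed, income_percentile]; target: credit risk (1) or not (0).
from sklearn.tree import DecisionTreeClassifier

X = [[1, 0, 40], [0, 1, 90], [0, 1, 30], [1, 1, 70], [0, 0, 20]]
y = [1, 0, 1, 1, 1]

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# Follow the learned tree of decisions for a new applicant:
# never defaulted, employed, income in the top percentiles
print(tree.predict([[0, 1, 90]]))   # e.g. [0] - low risk of default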

These are some of the more common techniques for data mining analysis, among a large group of data mining techniques that are commonly applied to analyzing large data sets. These techniques have proved beneficial for gathering useful information and relationships from data that may otherwise be too large to interpret well.



Source: http://ezinearticles.com/?Some-of-the-Main-Techniques-For-Data-Mining&id=4210436

Monday 2 September 2013

Scraping LinkedIn Public Profiles for Fun and Profit

Reconnaissance and Information Gathering is a part of almost every penetration testing engagement. Often, the tester will only perform network reconnaissance in an attempt to disclose and learn the company's network infrastructure (i.e. IP addresses, domain names, etc.), but there are other types of reconnaissance to conduct, and no, I'm not talking about dumpster diving. Thanks to social networks like LinkedIn, OSINT/WEBINT is now yielding more information. This information can then be used to help the tester test anything from social engineering to weak passwords.

In this blog post I will show you how to use Pythonect to easily generate potential passwords from LinkedIn public profiles. If you haven't heard about Pythonect yet, it is a new, experimental, general-purpose dataflow programming language based on the Python programming language. Pythonect is most suitable for creating applications that are themselves focused on the "flow" of the data. An application that generates passwords from the public LinkedIn profiles of a given company's employees has a coherent and clear dataflow:

(1) Find all the employees' public LinkedIn profiles → (2) Scrape all the employees' public LinkedIn profiles → (3) Crunch all the data into potential passwords

Now that we have the general concept and high-level overview out of the way, let's dive in to the details.

Finding all the employees' public LinkedIn profiles will be done via Google Custom Search Engine, a free service by Google that allows anyone to create their own search engine. The idea is to create a search engine that, when searching for a given company name, will return all the employees' public LinkedIn profiles. How? When creating a Google Custom Search Engine it's possible to refine the search results to a specific site (i.e. 'Sites to search'), and we're going to limit ours to: linkedin.com. It's also possible to fine-tune the search results even further, e.g. uk.linkedin.com to find only employees from the United Kingdom.

Access to the newly created Google Custom Search Engine will be made using a free API key obtained from the Google API Console. Why go through the Google API? Because it allows automation (no CAPTCHAs), and it also means that the search-result pages will be returned as JSON (as opposed to HTML). The only catch with using the free API key is that it's limited to 100 queries per day, but it's possible to buy an API key that will not be limited.

Scraping the profiles is a matter of iterating over all the hCards in all the search-result pages and extracting the employee name from each hCard. What is an hCard? hCard is a microformat for publishing the contact details of people, companies, organizations and places. hCard is also supported by social networks such as Facebook, Google+ and LinkedIn for exporting public profiles. Google (when indexing) parses hCards and, when relevant, uses them in search-result pages. In other words, when search-result pages include LinkedIn public profiles, they will appear as hCards and can be easily parsed.

Let's see the implementation of the above:

#!/usr/bin/python
#
# Copyright (C) 2012 Itzik Kotler
#
# scraper.py is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# scraper.py is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with scraper.py.  If not, see <http://www.gnu.org/licenses/>.

"""Simple LinkedIn public profiles scraper that uses Google Custom Search"""

import urllib
import simplejson


BASE_URL = "https://www.googleapis.com/customsearch/v1?key=<YOUR GOOGLE API KEY>&cx=<YOUR GOOGLE SEARCH ENGINE CX>"


def __get_all_hcards_from_query(query, index=0, hcards={}):

    url = query

    if index != 0:
        url = url + '&start=%d' % (index)

    json = simplejson.loads(urllib.urlopen(url).read())

    if json.has_key('error'):
        print "Stopping at %s due to Error!" % (url)
        print json
    else:
        for item in json['items']:
            try:
                hcards[item['pagemap']['hcard'][0]['fn']] = item['pagemap']['hcard'][0]['title']
            except KeyError as e:
                pass

        if json['queries'].has_key('nextPage'):
            return __get_all_hcards_from_query(query, json['queries']['nextPage'][0]['startIndex'], hcards)

    return hcards


def get_all_employees_by_company_via_linkedin(company):

    queries = ['"at %s" inurl:"in"', '"at %s" inurl:"pub"']

    result = {}

    for query in queries:
        _query = query % company
        result.update(__get_all_hcards_from_query(BASE_URL + '&q=' + _query))

    return list(result)
Replace <YOUR GOOGLE API KEY> and <YOUR GOOGLE SEARCH ENGINE CX> in the code above with your Google API Key and Google Search Engine CX respectively, save it to a file called scraper.py, and you're ready!

To kick-start, here is a simple program in Pythonect (that utilizes the scraper module) that searches for and prints the full names of all the Pythonect company employees:
   
"Pythonect" -> scraper.get_all_employees_by_company_via_linkedin -> print
The output should be:
   
Itzik Kotler
In my LinkedIn profile, I have listed Pythonect as a company that I work for, and since no one else is working there, when searching for all the employees of the Pythonect company, only my LinkedIn profile comes up.
For demonstration purposes I will keep using this example (i.e. the "Pythonect" company and the "Itzik Kotler" employee), but go ahead and replace Pythonect with other, more popular company names and see the results.

Now that we have a working skeleton, let's take its output and start crunching it. Keep in mind that every "password generation formula" is merely a guess. The examples below are only a sampling of what can be done. There are, obviously, many more possibilities and you are encouraged to experiment. But first, let's normalize the output - this way it's going to be consistent before operations are performed on it:

   
"Pythonect" -> scraper.get_all_employees_by_company_via_linkedin -> string.lower(''.join(_.split()))
The normalization procedure is short and simple: convert the string to lowercase and remove any spaces, and so the output should be now:

   
itzikkotler
As for data manipulation, out of the box (thanks to The Python Standard Library) we've got itertools and its combinatoric generators. Let's start by applying itertools.product:

   
"Pythonect" -> scraper.get_all_employees_by_company_via_linkedin -> string.lower(''.join(_.split())) -> itertools.product(_, repeat=4) -> print
The code above will generate and print every 4-character password from the letters: i, t, z, k, o, l, e, r. However, it won't cover passwords with uppercase letters in them. And so, here's a simple and straightforward implementation of a cycle_uppercase function that cycles over the input letters and yields copies of the input, each with one letter in uppercase:
   
def cycle_uppercase(i):
    s = ''.join(i)
    for idx in xrange(0, len(s)):
        yield s[:idx] + s[idx].upper() + s[idx+1:]
To use it, save it to a file called itertools2.py, and then simply add it to the Pythonect program after the itertools.product(_, repeat=4) block, as follows:
   
"Pythonect" -> scraper.get_all_employees_by_company_via_linkedin \
    -> string.lower(''.join(_.split())) \
        -> itertools.product(_, repeat=4) \
            -> itertools2.cycle_uppercase \
                -> print
Now, the program will also cover passwords that include a single uppercase letter. Moving on with the data manipulation: sometimes the password might contain symbols that are not found within the scraped data. In this case, it is necessary to build a generator that will take the input and add symbols to it. Here is a short and simple generator implemented as a generator expression:
   
[_ + postfix for postfix in ['123','!','$']]
To use it, simply add it to the Pythonect program after the itertools2.cycle_uppercase block, as follows:
   
"Pythonect" -> scraper.get_all_employees_by_company_via_linkedin \
    -> string.lower(''.join(_.split())) \
        -> itertools.product(_, repeat=4) \
            -> itertools2.cycle_uppercase \
                -> [_ + postfix for postfix in ['123','!','$']] \
                    -> print
The result is that the program now appends the strings '123', '!' and '$' to every generated password, which increases the chances of guessing the user's right password - or not, depending on the password :)

To summarize, it's possible to take OSINT/WEBINT data on a given person or company and use it to generate potential passwords, and it's easy to do with Pythonect. There are, of course, many different ways to manipulate the data into passwords and many programs and filters that can be used. In this aspect, Pythonect being a flow-oriented language makes it easy to experiment and research with different modules and programs in a "plug and play" manner.



Source: http://blog.ikotler.org/2012/12/scraping-linkedin-public-profiles-for.html

Sunday 1 September 2013

Basics of Online Web Research, Web Mining & Data Extraction Services

The evolution of the World Wide Web and search engines has brought an abundant and ever-growing pile of data and information to our fingertips. It has now become a popular and important resource for information research and analysis.

Today, Web research services are becoming more and more complicated. It involves various factors such as business intelligence and web interaction to deliver desired results.

Web researchers can retrieve web data using search engines (keyword queries) or by browsing specific web resources. However, these methods are not fully effective. Keyword search returns a large chunk of irrelevant data, and since each webpage contains several outbound links, it is difficult to extract data by browsing as well.

Web mining is classified into web content mining, web usage mining and web structure mining. Content mining focuses on the search and retrieval of information from the web. Usage mining extracts and analyzes user behavior. Structure mining deals with the structure of hyperlinks.

Web mining services can be divided into three subtasks:

Information Retrieval (IR): The purpose of this subtask is to automatically find all relevant information and filter out the irrelevant. It uses various search engines such as Google, Yahoo, MSN, etc. and other resources to find the required information.

Generalization: The goal of this subtask is to explore users' interests using data extraction methods such as clustering and association rules. Since web data is dynamic and inaccurate, it is difficult to apply traditional data mining techniques directly to the raw data.

Data Validation (DV): It tries to uncover knowledge from the data provided by the former tasks. Researchers can test various models, simulate them and finally validate given web information for consistency.




Source: http://ezinearticles.com/?Basics-of-Online-Web-Research,-Web-Mining-and-Data-Extraction-Services&id=4511101