How Edge Data Center Providers are Changing the Internet’s Geography

Yevgeniy Sverdlik on August 26, 2015

As people watch more and more of their video online versus cable TV or satellite services, and as businesses consume more and more cloud services versus buying hardware boxes and software licenses, the physical nature of internet infrastructure is changing. Whether your drug is Bloodline on Netflix or Duck Dynasty on Amazon Prime, the web content companies go to great lengths to make sure you can binge in high def. The same goes for cloud services, for whom performance on the user’s end means everything in their busy, competitive market.

In recent years, the big push has been to improve the quality of these high-bandwidth web services to users outside of the top metros like New York, Los Angeles, or San Francisco. And the best way to do it has been caching the most popular content or web-application data on servers closer to the so-called “tier-two markets,” places like Phoenix, Minneapolis, or St. Paul. This push has created a whole new category of data center service providers that call their facilities “edge data centers.” These are facilities that quite literally extend the “edge” of the internet further from the traditional internet hubs in places like New York, Northern Virginia, Dallas, or Silicon Valley.

Examples of companies that describe themselves as edge data center providers include EdgeConneX, vXchnge, and 365 Data Centers. Building something that can be truly called an “edge data center” requires a different set of considerations than building a standard colocation facility. It’s about creating interconnection ecosystems in cities away from the traditional core markets.


That EdgeConneX went from zero data centers two years ago to two dozen (and growing) today, and that vXchnge bought eight Sungard data centers in tier-two markets in one go earlier this year, illustrates just how quickly demand for edge data centers is growing.

The Edge is a Place

Ultimately, location is the main way for companies like EdgeConneX to differentiate from the big colo players like Equinix or Interxion. Edge data center providers are essentially building in tier-two markets what Equinix and its rivals have built in the big core markets: hubs where all the players in the long chain of delivering content or services to customers interconnect and exchange traffic. These hubs are where most of the internet has lived and grown for the bulk of its existence, and edge data center companies are building smaller hubs in places that don’t already have them but are becoming increasingly bandwidth-hungry.

Take, for example, CloudFlare, a content delivery network and internet security services company. To host its infrastructure in all the top markets, CloudFlare uses the big colos, including Equinix, Interxion, and TelecityGroup, which Equinix recently acquired. But it also recently became an EdgeConneX customer, because EdgeConneX could extend its network to places where the likes of Equinix don’t have data centers, Joshua Motta, director of special projects at CloudFlare, said.

You don’t see Equinix facilities in places like Phoenix, Las Vegas, or Minneapolis, he said. The distinction isn’t as clear-cut as it may seem at first, however. While it’s true that Equinix doesn’t have facilities in those three markets specifically, it does have presence in other cities that can be considered second-tier markets: places like Boston, Denver, and Philadelphia. But the size of Equinix data centers in those markets is very small compared to its marquee facilities in Silicon Valley, New York, London, or Frankfurt. Compare, for example, the nearly half a million square feet across seven Equinix data centers in the New York metro to 13,000 square feet in a single facility in Boston, and you’ll get the idea.

Transport Versus Peering: a Question of Cost

Boston is also an example of a market without a good data center option for CloudFlare, Motta said. There are plenty of data centers in town, and there is even an internet exchange, but there isn’t a data center where transit providers (companies that carry traffic over long distances), access networks (the home or business internet service providers), and content companies come together and interconnect, he said. It’s possible that this problem is specific to CloudFlare, he warned, since the company prefers not to pay for peering with access networks.


Security Risks of FTP and Benefits of Managed File Transfer

File transfer services such as FTP or HTTP have long been the most common way to move files for business purposes. In a typical file transfer, a protocol such as FTP or HTTP sends the stream of bits stored as a single unit in a file system, along with its file name, size, timestamp, and other metadata, from one host to another over a TCP-based network such as the Internet.

But this process is not foolproof. FTP, by itself, is not a secure file transfer protocol and has many security vulnerabilities; it is a known fact that FTP does not provide any encryption for data transfer. Most of the time, the requirement in any business is pretty simple: transfer files between two endpoints in different locations. The parties involved often do not think much about how secure the file transfer process is going to be.
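To see what the lack of encryption means in practice, here is a minimal sketch using Python's standard ftplib module; the host, credentials, and file name are placeholders, not a real configuration. The plain-FTP upload travels entirely in clear text, while the FTP-over-TLS (FTPS) variant encrypts both the control and data channels.

    from ftplib import FTP, FTP_TLS

    def upload_plain_ftp(local_path):
        # Plain FTP: anyone sniffing the wire can read the user name,
        # password, and file contents of this transfer.
        ftp = FTP("ftp.example.com")
        ftp.login("user", "secret")
        with open(local_path, "rb") as f:
            ftp.storbinary("STOR report.csv", f)
        ftp.quit()

    def upload_ftps(local_path):
        # FTPS: the control channel is wrapped in TLS, and prot_p()
        # switches the data channel to an encrypted connection as well.
        ftps = FTP_TLS("ftp.example.com")
        ftps.login("user", "secret")
        ftps.prot_p()
        with open(local_path, "rb") as f:
            ftps.storbinary("STOR report.csv", f)
        ftps.quit()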

Using FTP for official file transfer can leave your data transmission exposed to many security attacks:

FTP Bounce Attack

Generally, a file transfer happens when the source FTP server sends data to the client, which then transmits it to the destination FTP server. Over a slow network connection, people often resort to proxy FTP, which lets the client instruct the two FTP servers to exchange data directly. A hacker can take advantage of this type of file transfer by using the PORT command to request access to ports while posing as a middleman for the file transfer request, then discreetly execute port scans on hosts and gain access to data transmitted over the network.

FTP Brute Force Attack

An attacker can carry out a brute force attack to guess the FTP server password by repeatedly trying different password combinations until the break-in succeeds. A weak password, or reuse of the same password across multiple FTP servers, helps the hacker gain access even faster. Once the password is guessed, your data is exposed.

Packet Capture (or Sniffing)

Because data transferred via FTP is in clear text, any sensitive information such as usernames and passwords can easily be read using network packet capture techniques such as packet sniffing. A packet sniffer is simply a program that captures transmitted data packets and decodes their raw contents, exposing the data contained in the various fields of each packet.

Even when we restrict access to FTP servers based on network address, it is possible for a cyber-criminal to use an external computer, assume the host address of a computer on the enterprise network, and download files during a data transfer.

When operating systems assign dynamic port numbers in a particular order or pattern, an attacker can easily decode the pattern and identify the next port number that will be used. By illegally claiming that port number, the attacker can deny the legitimate client access to the file, steal files, or even insert a forged or malicious file into the data stream that will then be accessed by other legitimate users in the organization.

As discussed above, there are many devious means of intercepting an FTP-based file transfer, and the chances of your data being exposed are high. Networks that must adhere to federal compliance norms such as PCI DSS, HIPAA, and GLBA, and agencies and institutions that share government data and customer records, are at high risk if they depend on FTP alone for file transfer. So, what’s the optimum solution if not FTP?

Managed File Transfer Remedies the Vulnerabilities in FTP

Managed file transfer (MFT) is the best option for file transfer compared with other file sharing methods such as FTP, HTTP, TFTP, peer-to-peer file sharing, and cloud drives. A managed file transfer server facilitates secure file transfer over the Internet by providing a high level of data security. MFT server software provides secure internal, external, and ad hoc file transfers for both pull-based and push-based workflows.

Though MFT also uses FTP for data transfer, it ensures the data is protected by using secure variants (FTPS, SFTP, etc.). With B2B file transfers, especially in a DMZ environment where internal IP addresses need to be concealed, the MFT server’s authentication and data encryption methods help ensure secure, reliable, and auditable file transfer.
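As a rough illustration of the secure-variant approach, the sketch below uses the third-party paramiko library to upload a file over SFTP (file transfer tunneled through SSH) and computes a SHA-256 digest that an MFT-style audit step could use to verify integrity on the receiving end. The server name, credentials, and paths are assumptions for the example, not a prescribed setup.

    import hashlib
    import paramiko

    def sftp_upload_with_checksum(local_path, remote_path):
        # Hash the file before sending so the receiver can confirm
        # it arrived unmodified.
        with open(local_path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()

        # All SFTP traffic rides inside an encrypted SSH session.
        client = paramiko.SSHClient()
        client.load_system_host_keys()
        client.connect("mft.example.com", username="user", password="secret")
        sftp = client.open_sftp()
        sftp.put(local_path, remote_path)
        sftp.close()
        client.close()
        return digest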

MFT is widely used for securely transferring files over public or private networks and you can:

  • Perform secure file transfer via FTP, FTPS, SFTP, HTTP and HTTPS over IPv4 or IPv6 networks
  • Carry out ad hoc file transfer
  • Monitor the file transfer process in real time
  • Get notified of the status once the transfer is complete
  • Report on transfer activity and user access
  • Limit MFT access by user role and integration with Active Directory
  • Transport large files with integrity checks and protocol fidelity

Where secure file transfer at the organizational level is concerned, an MFT server is the best option, ensuring both security and endpoint-management simplicity compared with FTP.

Guest post by: Vinod Mohan, Product Marketing Specialist Team Lead at SolarWinds, with technical expertise in IT management and operations spanning IT security, SIEM, network management, application, systems, storage, and virtualization management.


Operation Migration

Aircraft-guided bird migration

Operation Migration has played a lead role in the reintroduction of endangered Whooping cranes into eastern North America. In the 1940s the species was reduced to just 15 birds.

Between 2001 – 2015, Operation Migration pilots used ultralight aircraft and played the role of surrogate parents to guide captive-hatched and imprinted Whooping cranes along a planned migration route, which began in Wisconsin and ended in Florida.

In early 2016, the U.S. Fish and Wildlife Service determined that the aircraft-guided method was too “artificial” and that cranes raised by costumed handlers missed early learning opportunities. As a result, it was speculated that they did not properly nurture or protect their chicks once they had offspring of their own, and that this inattentiveness was the cause of high pre-fledge mortality at the Necedah National Wildlife Refuge.

The U.S. Fish and Wildlife Service exercised its authority under the Endangered Species Act and ended the aircraft-guided reintroduction method.

Going Forward… Operation Migration remains committed to the eastern flock. With just over 100 Whooping cranes in the flock, OM will assist with the release of the 2016 parent-reared crane chicks and track their fall migration. Our staff will monitor nesting birds in the spring to help determine the cause of chick mortality by documenting the movements and behavior of pairs with chicks. We will use ground observations and radio telemetry to gather data.

We will also capture cranes for transmitter replacement because, as we all know, batteries don’t last forever. We will also stream live video via our CraneCam from the White River Marsh State Wildlife Area beginning in April. As always, we will report Whooping crane news to you in our In The Field blog.

Above: Jane Goodall shows her support for “Operation Migration!”

WE NEED YOUR HELP!

Operation Migration relies on contributions from individuals and foundations to continue our work. You can help ensure the Whooping crane survives for future generations by calling 800-675-2618, or pledge your support right here!



Online Store Solutions: Multi-vendor, Multi-store B2B/B2C eCommerce

Piecing together your eCommerce business:
The SMART way

At SmartWin, they make it possible to run a B2B or B2C business on my own terms. Their unique eCommerce platform takes care of all the legwork of standard functions, leaving me a user-friendly template to simply adopt and customize.

Regardless of the size or type of your business, with SmartWin you will find the solution you need. Our software scales from a handful of products to millions.

SmartWin Technology has specialized in Microsoft IIS-based eCommerce solutions since 1997. Over the years we have developed a range of advanced technologies and software components that are adaptable, reliable, and scalable, with a proven track record. They are generic programs that can be customized to fit any application.

Additional Services from SmartWin

Our unique SEO methodology is built on top of our own eCommerce software and can also be applied to any website using a third-party eCommerce program. The services utilize the “franchise” feature of our eCommerce platform. Together with some other advanced techniques, this lets us build progressive, automated search engine ranking improvements into your site.

More reasons to employ SmartWin SEO automation.

Need a Web developer to improve your site, or an admin to assist you in managing it? We specialize in Microsoft technologies, and our custom programming covers ASP.NET 4.0, MVC 3.0, AJAX, jQuery, MS SQL 2008, LINQ to SQL, WCF, and more.

Internet Marketing Services
Looking for an SEO service provider to progressively boost your Page Rank that actually works?
Read more.

Web Management Services
Need a Web developer to improve your site, or an admin to assist you in managing it?
Read more.

Shopping Cart Software
All-in-one online/phone ordering system, plus real-time payment and after-sales management. Adds to an existing site with a few lines of code.
Read more.

Online Warehouse Builder
Got a large inventory and want to sell to different groups of customers? This database-driven software handles every aspect of online sales: one database, many storefronts, multiple prices.
Read more.

Franchise Mall Builder
Offers a way to expand your business into a multi-vendor, multi-storefront B2B/B2C shopping mall. Plus, franchise stores can run under their own domains.
Read more.

Copyright SmartWin Technology 1997 – 2012. All Rights Reserved.


What is a tuple?

tuple

1) In programming languages such as Lisp, Python, Linda, and others, a tuple (pronounced TUH-pul) is an ordered set of values. The separator for each value is often a comma (depending on the rules of the particular language). Common uses for the tuple as a data type are (1) passing a string of parameters from one program to another, and (2) representing a set of value attributes in a relational database. In some languages, tuples can be nested within other tuples, inside parentheses, brackets, or other delimiters. Tuples can contain a mixture of other data types.

Here’s an example of a tuple that emphasizes the different data types that may exist within a tuple data type:
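For instance, in Python (the values here are arbitrary placeholders):

    # A 4-tuple mixing a string, an integer, a float, and a Boolean value
    record = ("Alice", 42, 98.6, True)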

The above example is sometimes referred to as a 4-tuple, since it contains four values. An n-tuple would be one with an indeterminate or unspecified number of values.

2) A tuple is analogous to a record in nonrelational databases.

The term originated as an abstraction of the sequence: single, double, triple, quadruple, quintuple, …, n-tuple. Tuple is used in abstract mathematics to denote a multidimensional coordinate system.

This was last updated in April 2005




5 Reasons to Use a Software Load Balancer


Today, computer and internet usage is at an all-time high, and reliable performance is both necessary and critical for businesses of all sizes. To increase the speed of system loading, decrease downtime, and eliminate single points of failure, load balancing is the answer. Load balancers help provide the seamless experience that users expect. Well-designed infrastructure includes a good load balancing plan so that any potential failures are detected, requests are rerouted to redundant points, and users never notice a failure.

Until very recently, load balancing was heavily dependent on hardware, but that has all changed. With load balancing software, these tasks are handled smoothly and automatically. In fact, there are a number of reasons to choose load balancing software.
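As a toy sketch of what load balancing software does at its core, the Python snippet below rotates requests across a pool of backends and skips any that fail a health check. The addresses and the always-healthy check are stand-ins for illustration, not any real product's behavior.

    import itertools

    class RoundRobinBalancer:
        def __init__(self, backends):
            # e.g. ["10.0.0.1:80", "10.0.0.2:80"] -- placeholder addresses
            self.backends = backends
            self._cycle = itertools.cycle(backends)

        def healthy(self, backend):
            # Real load balancers probe HTTP endpoints or TCP ports;
            # here every backend is assumed healthy.
            return True

        def pick(self):
            # Walk the rotation until a healthy backend is found.
            for _ in range(len(self.backends)):
                candidate = next(self._cycle)
                if self.healthy(candidate):
                    return candidate
            raise RuntimeError("no healthy backends available")

    balancer = RoundRobinBalancer(["10.0.0.1:80", "10.0.0.2:80", "10.0.0.3:80"])
    print(balancer.pick())   # each call hands the next request to a different server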

1. Less Expensive

Deploying software is much less expensive than buying hardware every time a change is made. Replacing hardware with load balancing software is DevOps-friendly and eliminates the siloing between DevOps and the rest of the departments within a business. It puts application management squarely in the hands of those best able to handle it. Additionally, maintenance can be done anytime, anywhere.

2. Scalable

Software load balancing is a natural choice for achieving high availability that is sustainable as the business and infrastructure grow. Also, having at least two backend servers maintains high availability, with software load balancers ensuring that traffic is directed to the server that is more readily available.

3. Easier Maintenance

This is one of the main reasons a software load balancer is a better choice than a hardware-based application delivery controller (ADC). In fact, performance is often a serious issue with legacy ADCs. Load balancing software can run anywhere, and any upgrades or maintenance can be done from a variety of devices – PCs, tablets, or even smartphones.

4. Flexible

Migrating old, hardware-based infrastructure to cloud-based environments allows agile development and the ability to upgrade and refine features easily. Software load balancers can be deployed anywhere. They work easily in both cloud and virtual environments and have open APIs so they can be integrated with all the tools you already use. Simply download and configure the software – no expensive hardware required.

5. Faster

Nobody likes features that are buggy or underperform. We expect things to work right the first time and every time after that. In our increasingly digital world, we want instant responses and fast load times. Software load balancers will run fast in any environment. There are no hardware configuration limitations and you can scale infrastructure to the size you need. Load balancing software gives you the power to manage delivery effectively for optimal performance.

Software load balancing use is growing rapidly, and it will continue to grow and be refined further as time goes by. We are already seeing huge organizations use load balancing software, with the Amazon load balancer, Elastic Load Balancing (ELB), one of the most popular examples.


Training and Placement

Dear Prospective Training Candidates,

CBS Information Systems, Inc. is a fast-growing software development and training company that has been offering mission-critical solutions to businesses through cutting-edge technologies since 2000.

We are in the process of accepting candidates for various training programs. The training can be taken in a classroom setting or remotely (online) with a live instructor.

If you are qualified, available, interested, or planning to make a change, please RESPOND IMMEDIATELY. In considering candidates, time is of the essence, so please respond ASAP.

Here are our offerings:

  • Certified trainers with real-time experience.
  • Unlimited lab access during non-training hours.
  • Aggressive placement assistance: CBS will assist in job placement.
  • Fee reimbursement upon successful placement by CBS.
  • No contract.
  • Real-time project exercises and step-by-step procedures with handouts.
  • Assistance with resume and interview preparation.
  • Training drawn from hands-on consultancy experience.
  • Talk with experienced real-time consultants available for comments and guidance.
  • Open to candidates with any visa status.
  • We provide the best possible trainer for each course.
  • We offer the most competitive pricing for our training programs.
  • We offer 100% online training with live instructors.
  • You get trained from the comfort of your own place.
  • We use a state-of-the-art learning management system.
  • You are also connected by phone with the live instructor during class, in a conference fashion.

10 emerging technologies for Big Data

I’ve recently had the opportunity to have a conversation with Dr. Satwant Kaur on the topic of Big Data (see my previous interview with Dr. Kaur, “The 10 traits of the smart cloud”). Dr. Kaur has an extensive history in IT, being the author of Intel’s Transitioning Embedded Systems to Intelligent Environments. Her professional background, which includes four patents while at Intel CA, 20 distinguished awards, ten keynote conference speeches at IEEE, and over 50 papers and publications, has earned her the nickname “The First Lady of Emerging Technologies.” Dr. Kaur will be delivering the keynote at the CES show: 2013 IEEE International Conference on Consumer Electronics (ICCE).

While the topic of Big Data is broad and encompasses many trends and new technology developments, she managed to give me a very good overview of what she considers to be the top ten emerging technologies that are helping users cope with and handle Big Data in a cost-effective manner.

Dr. Kaur:

Column-oriented databases

Traditional, row-oriented databases are excellent for online transaction processing with high update speeds, but they fall short on query performance as data volumes grow and as data becomes more unstructured. Column-oriented databases store data with a focus on columns instead of rows, allowing for huge data compression and very fast query times. The downside is that these databases generally allow only batch updates, so their update times are much slower than those of traditional models.
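A tiny, illustrative Python sketch of the difference (the records are made up): storing the same data column by column groups like values together, which is what makes aggressive compression and fast single-column scans possible.

    # Row-wise layout: each record is kept together.
    rows = [
        ("2015-08-01", "phoenix", 112),
        ("2015-08-01", "boston",   88),
        ("2015-08-02", "phoenix", 110),
    ]

    # Column-wise layout: like values sit next to each other, which
    # compresses well (e.g. run-length encoding the repeated dates)
    # and lets a query touch only the columns it needs.
    columns = {
        "date": ["2015-08-01", "2015-08-01", "2015-08-02"],
        "city": ["phoenix", "boston", "phoenix"],
        "temp": [112, 88, 110],
    }

    # Aggregating one column scans a single contiguous list instead of every row.
    avg_temp = sum(columns["temp"]) / len(columns["temp"])
    print(avg_temp)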

Schema-less databases, or NoSQL databases

There are several database types that fit into this category, such as key-value stores and document stores, which focus on the storage and retrieval of large volumes of unstructured, semi-structured, or even structured data. They achieve performance gains by doing away with some (or all) of the restrictions traditionally associated with conventional databases, such as read-write consistency, in exchange for scalability and distributed processing.
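A minimal sketch of the key-value/document idea, not modeled on any particular NoSQL product: each key maps to a free-form document, and no schema is enforced across documents.

    # Toy in-memory key-value/document store.
    store = {}

    def put(key, document):
        store[key] = document          # no schema is enforced

    def get(key):
        return store.get(key)

    # Different keys can hold documents with entirely different fields.
    put("user:1001", {"name": "Alice", "plan": "pro"})
    put("event:42",  {"type": "click", "page": "/home", "ts": 1440633600})

    print(get("user:1001")["name"])    # documents are retrieved by key, not by query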

MapReduce

MapReduce is a programming paradigm that allows for massive job execution scalability against thousands of servers or clusters of servers. Any MapReduce implementation consists of two tasks (a toy word-count sketch follows the list below):

  • The “Map” task, where an input dataset is converted into a different set of key/value pairs, or tuples;
  • The “Reduce” task, where several of the outputs of the “Map” task are combined to form a reduced set of tuples (hence the name).
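Here is the toy word-count sketch mentioned above, run in a single Python process; real frameworks such as Hadoop execute the same two phases in parallel across many machines, with the shuffle between them handled by the framework.

    from collections import defaultdict

    def map_task(document):
        # Map: emit a (word, 1) pair for every word in the input.
        return [(word, 1) for word in document.split()]

    def reduce_task(pairs):
        # Reduce: combine all pairs that share a key into a single count.
        counts = defaultdict(int)
        for word, n in pairs:
            counts[word] += n
        return dict(counts)

    documents = ["big data big clusters", "big jobs"]
    mapped = [pair for doc in documents for pair in map_task(doc)]
    print(reduce_task(mapped))   # {'big': 3, 'data': 1, 'clusters': 1, 'jobs': 1}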

Hadoop

Hadoop is by far the most popular implementation of MapReduce, being an entirely open source platform for handling Big Data. It is flexible enough to be able to work with multiple data sources, either aggregating multiple sources of data in order to do large scale processing, or even reading data from a database in order to run processor-intensive machine learning jobs. It has several different applications, but one of the top use cases is for large volumes of constantly changing data, such as location-based data from weather or traffic sensors, web-based or social media data, or machine-to-machine transactional data.

Hive is a “SQL-like” bridge that allows conventional BI applications to run queries against a Hadoop cluster. It was developed originally by Facebook, but has been made open source for some time now, and it’s a higher-level abstraction of the Hadoop framework that allows anyone to make queries against data stored in a Hadoop cluster just as if they were manipulating a conventional data store. It amplifies the reach of Hadoop, making it more familiar for BI users.

PIG is another bridge that tries to bring Hadoop closer to the realities of developers and business users, similar to Hive. Unlike Hive, however, PIG uses a “Perl-like” language that allows for query execution over data stored on a Hadoop cluster, instead of a “SQL-like” language. PIG was developed by Yahoo and, just like Hive, has been made fully open source.

WibiData is a combination of web analytics with Hadoop, being built on top of HBase, which is itself a database layer on top of Hadoop. It allows web sites to better explore and work with their user data, enabling real-time responses to user behavior, such as serving personalized content, recommendations and decisions.

Perhaps the greatest limitation of Hadoop is that it is a very low-level implementation of MapReduce, requiring extensive developer knowledge to operate. Between preparing, testing, and running jobs, a full cycle can take hours, eliminating the interactivity that users enjoyed with conventional databases. PLATFORA is a platform that turns users’ queries into Hadoop jobs automatically, creating an abstraction layer that anyone can exploit to simplify and organize datasets stored in Hadoop.

As the data volumes grow, so does the need for efficient and effective storage techniques. The main evolutions in this space are related to data compression and storage virtualization.

SkyTree is a high-performance machine learning and data analytics platform focused specifically on handling Big Data. Machine learning, in turn, is an essential part of Big Data, since the massive data volumes make manual exploration, or even conventional automated exploration methods, unfeasible or too expensive.

Big Data in the cloud

As we can see from Dr. Kaur’s roundup above, most, if not all, of these technologies are closely associated with the cloud. Most cloud vendors are already offering hosted Hadoop clusters that can be scaled on demand according to their users’ needs. Also, many of the products and platforms mentioned are either entirely cloud-based or have cloud versions themselves.

Big Data and cloud computing go hand in hand. Cloud computing enables companies of all sizes to get more value from their data than ever before by enabling blazing-fast analytics at a fraction of previous costs. This, in turn, drives companies to acquire and store even more data, creating more need for processing power and driving a virtuous circle.

About Thoran Rodrigues


After working for a database company for 8 years, Thoran Rodrigues took the opportunity to open a cloud services company. For two years his company has been providing services for several of the largest e-commerce companies in Brazil, and over this time he had the opportunity to work on large scale projects ranging from data retrieval to high-availability critical services.


Children’s Hospital of Philadelphia


Breakthroughs.

Let’s get started.

Inside Children's Hospital

At Children’s Hospital of Philadelphia, we’re always learning, growing and exploring in our effort to deliver the highest quality, compassionate, family-centered care. Here’s what’s happening now.

For me, her scars are beautiful.


FDA Approves Personalized Cellular Therapy for Advanced Leukemia

The U.S. FDA today approved a personalized cellular therapy for advanced pediatric leukemia.


Surgeons Separate Twin Girls Joined at Heads

On June 6, 2017, surgeons successfully completed the separation of 10-month-old conjoined twins Erin and Abby Delaney.


Join Us!

Lace up your sneakers, grab some ice cream, or put on your dancing shoes: our upcoming events will keep you motivated and support a great cause!


CHOP Among U.S. News & World Report’s Top Children’s Hospitals

Children's Hospital of Philadelphia is among the nation's best children's hospitals in the 2017-18 U.S. News & World Report Honor Roll of Best Children's Hospitals.


How Cancer Immunotherapy Works

This is a child-friendly explanation of how cancer immunotherapy, or T-cell therapy, works.

You’re Invited!

Join us on Oct. 1 for the Newborn and Infant Chronic Lung Disease Program Family Reunion, a fun day celebrating our patients and families!

Defeating the Odds

A pioneering treatment at CHOP gives Austin a second chance at life.

A Path to a Cure

Tristan's early-onset scoliosis was treated with serial body casting, an innovative therapy that gradually straightens spinal curves.

Why Sleep Schedules Matter

As the summer ends and school and daycare schedules resume, it's important to shift back to a healthy sleep routine.


MAXQDA: Qualitative Data Analysis Software

Software for Qualitative, Quantitative and Mixed Methods Research

Every two months we will let you know about new products, updates, new features, promotions, trainings, and upcoming events.

I spent several months researching the options, and ultimately decided to trial MAXQDA. We brought in a MAXQDA certified trainer, and bought a network license so that our large team at Microsoft could use the tool. We were not disappointed. […] I was so convinced of its efficacy in the applied qualitative field that I bought MAXQDA for my team when I joined Amazon. I was especially delighted when they recently added the Stats package, which allows us to avoid the extra expense of buying SPSS. I use MAXQDA almost daily, and am the first to recommend it to applied research teams.

Sam Ladner, PhD, Senior UX Researcher, Amazon

I really enjoyed working with the brand-new version (MAXQDA Analytics Pro). I was waiting for this qualitative-quantitative software, which helps me manage even quantitative content analysis without using SPSS. I can now manage the whole research process inside MAXQDA, and it’s a great experience!

László Galántai, University of Pécs, Institute for Education, Pécs, Hungary

Easy to learn and very efficient to use. Thanks to the elegant and straightforward design of MAXQDA, you can easily translate flashes of insight into descriptive and theoretical codes, analytical models in the form of a hierarchical code system, and various types of memos, without being bothered by complicated technicalities.

Ikuya Sato, Professor of Sociology and Organizational Science, Hitotsubashi University, Graduate School of Commerce and Management, Tokyo, Japan

I have used MAXQDA as my primary recommended qualitative data analysis software program since 2005, when I began facilitating Spanish-language projects and MAXQDA was the only program with full documentation in Spanish. Since that time, I have taught MAXQDA in a wide variety of settings including education, government, and non-profit organizations.

Karen Andes, Ph.D., Assistant Professor, Hubert Department of Global Health – Rollins School of Public Health, Emory University, USA

What I like about MAXQDA is its ease of use (a low learning curve), the depth and breadth of its classification and analysis functions, and the possibility of conducting different kinds of qualitative or text analysis simultaneously.

Warren A. Hodge, Ph.D., University of North Florida, Department of Counseling and Educational Leadership, USA

The ability to easily import string and numeric selection variables provides an invaluable tool for relating complementary quantitative and qualitative research strategies. This and other features have made MAXQDA particularly well-suited for integrating ethnographic and epidemiological methods for a cultural epidemiology of various health problems.

Mitchell Weiss, MD, PhD, Professor and Head, Department of Public Health and Epidemiology, Swiss Tropical Institute, Basel, Switzerland