Web Development Archives | i4 Consulting Pvt. Ltd

Should IPv6 be “compatible” with IPv4?

IPv6 deployment continues, but at a much slower pace than expected. At this rate, we will still have IPv4 for the next 20 or 30 years. Faced with this exasperating slowness, we often hear experts assert that it is the consequence of a fundamental error made at the start of the design of the IPv6 protocol: it should have been made “compatible with IPv4”. What does that mean, and would it have been possible?

To understand, let’s first see how IPv6 is incompatible with IPv4. This particularly concerns routers and applications. The format of the IPv6 packet header is very different from that of the IPv4 packet header. A router that only knows IPv4 can do nothing with an IPv6 packet; it cannot even parse it. IPv6 therefore required an update of all routers (largely done today, even on low-end hardware). And the applications? In theory, many applications do not need to know the details of the network layer: isolating applications from the particularities of the network is one of the goals of the layered model. But there are exceptions (a server application with ACLs, for example, has to manipulate IP addresses), and, most importantly, many applications are not written against a high-level API: programmers use, for example, the socket API, which exposes lots of unnecessary details, such as the size of IP addresses, thus binding the application to a particular network protocol. IPv6 therefore required updating a lot of applications, which was done long ago for the big, well-known free software (Apache, Unbound, Postfix, etc.) but not necessarily for small in-house software developed by the local IT services company.
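This protocol dependence is visible in a short sketch (a simplified illustration; the hostname and port in the test call are placeholders): code that hard-codes `AF_INET` can only ever speak IPv4, while `getaddrinfo` makes the same code work over either address family.

```python
import socket

# Protocol-bound: hard-codes AF_INET, so this code only ever speaks IPv4.
def connect_v4_only(host, port):
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect((host, port))
    return s

# Protocol-agnostic: getaddrinfo returns a candidate for every address
# family the host supports (AF_INET and/or AF_INET6), so the same code
# works unchanged over IPv4 and IPv6.
def connect_any(host, port):
    last_err = None
    for family, socktype, proto, _name, addr in socket.getaddrinfo(
            host, port, type=socket.SOCK_STREAM):
        try:
            s = socket.socket(family, socktype, proto)
            s.connect(addr)
            return s
        except OSError as err:
            last_err = err
    raise last_err
```

An application written in the second style needed no change for IPv6; one written in the first style had to be ported.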

Could we have gotten away with designing IPv6 differently? Basically, no. To see why, we must start from the requirements for IPv6: the main problem was the exhaustion of IPv4 addresses, so longer addresses were needed (128 bits for IPv6 against 32 for IPv4). Even if this had been the only change to the packet header format, it would have been enough to make the protocol incompatible, and thus to force changes to routers as well as to applications that depend on IPv4. Regretting that the IETF changed other aspects of the header, which could have been left alone, makes no sense: the change of address size alone invalidates all the IPv4 code. This would not be the case if IP packet headers were encoded as TLVs or in another format with variable-size fields. But, for performance reasons (a router may have to handle hundreds of millions of packets per second), IP packets have a binary encoding with fixed-size fields. Any change to the size of one of these fields therefore requires changing all the packet-processing code and all the ASICs in routers.
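The fixed-offset problem can be made concrete with a small sketch (the header stubs are hand-built for illustration, not captured traffic; offsets follow RFC 791 for IPv4 and RFC 8200 for IPv6):

```python
# Both headers put the 4-bit version field first, but every other field
# sits at a different, hard-coded offset -- which is why code written
# for one version cannot make sense of the other.

def source_address(packet: bytes) -> bytes:
    version = packet[0] >> 4
    if version == 4:
        return packet[12:16]      # IPv4: 4-byte source address at offset 12
    if version == 6:
        return packet[8:24]       # IPv6: 16-byte source address at offset 8
    raise ValueError("unknown IP version")

# Hand-built header stubs (illustrative only):
ipv4_hdr = bytes([0x45]) + bytes(11) + bytes([192, 0, 2, 1]) + bytes(4)
ipv6_hdr = bytes([0x60]) + bytes(7) + bytes.fromhex("20010db8" + "00" * 12) + bytes(16)

assert source_address(ipv4_hdr) == bytes([192, 0, 2, 1])
assert len(source_address(ipv6_hdr)) == 16
```

Every parser, firewall rule engine, and router ASIC bakes in offsets like these, which is why "just adding bits" is not a small change.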

Even in the absence of this on-the-wire encoding problem, it is not certain that all existing programs would support a change in address size. How many older applications take for granted that IP addresses are only 32 bits long and, if written in C, store them in an int (usually 32 bits)?
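The standard-library `ipaddress` module makes it easy to check why that assumption breaks (addresses below are from the documentation prefixes):

```python
import ipaddress

v4 = int(ipaddress.ip_address("192.0.2.1"))
v6 = int(ipaddress.ip_address("2001:db8::1"))

assert v4.bit_length() <= 32   # fits the 32-bit int such code assumes
assert v6.bit_length() > 32    # cannot fit: IPv6 needs up to 128 bits

# Forcing the IPv6 address into a 32-bit field silently corrupts it:
truncated = v6 & 0xFFFFFFFF
assert truncated != v6
```

Any program storing addresses in a fixed 32-bit variable would corrupt IPv6 addresses exactly as the truncation above does.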

Nevertheless, despite these long-known facts, we often come across claims such as “the IETF should just have added bits to the addresses without changing the format.” As we have seen, any change in address size changes the format. And if we do not change the size of the addresses, why use a new protocol at all?

This does not mean that IPv4 and IPv6 need to be unable to talk to each other, like ships passing in the night. One might think that an address-translation solution would allow at least some exchange, but be careful not to simply copy IPv4 NAT. NAT uses TCP and UDP ports to identify a particular session and to know where to send packets. There are only 16 bits to store a port, which is not enough to represent all IPv6 addresses within IPv4 addresses (we would still be 80 bits short…). There are many solutions involving address translation, such as NAT64 (RFC 6146), but they can only be used in limited cases (for NAT64, between a purely IPv6 client and a purely IPv4 server), and they introduce additional dependencies (for NAT64, the need for a special DNS resolver, see RFC 6147). In short, there is not, and cannot be, a mechanism for complete compatibility between a protocol that uses 32-bit addresses and a protocol that uses 128-bit addresses. There are partial solutions (the simplest, often forgotten, is an application-level relay), but no complete one.
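The arithmetic behind the "80 missing bits" can be checked in a few lines:

```python
# An IPv4 address (32 bits) extended with a TCP/UDP port (16 bits)
# yields at most 48 bits of identifier space.
ipv4_plus_port_bits = 32 + 16
ipv6_bits = 128

# Exactly the 80 missing bits mentioned above:
assert ipv6_bits - ipv4_plus_port_bits == 80

# The IPv6 space is 2**80 times larger than anything the
# address-plus-port trick could ever name:
assert 2 ** ipv6_bits == 2 ** ipv4_plus_port_bits * 2 ** 80
```

No clever encoding can close a gap of 2**80: that is why all translation schemes remain partial.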

Of course, all of that assumes we want to keep compatibility with older machines and software. If we started from scratch, we could make a layer-3 protocol with variable-size addresses, but it would no longer be IP, and such a protocol would be even more difficult and expensive to deploy than a new version of IP like IPv6.

Is it just me who does not see a solution, or is it really a fundamental problem? So far, a lot of people have moaned “IPv6 should have been IPv4-compatible,” but I have not yet seen any detailed proposal for how to do that. There are plenty of back-of-the-envelope ideas, scribbled in two minutes, that will never go any further. Writing a tweet is one thing; specifying a protocol, even partially, is something else. We see, for example, someone stepping outside their field of expertise (cryptography) to write “they made the IPv6 address space an alternative to the IPv4 address space, rather than an extension to the IPv4 address space”. But he did not go any further. At least the author of the ridiculous project called IPv10 made the effort to detail his proposal a little (the same author had also proposed a project to connect satellites by optical fiber). It is precisely because his proposal is relatively detailed that we can see it does not hold up: the packet format (the only thing it specifies with any precision) being different, its deployment would be at least as slow as that of IPv6. The cryptographer mentioned above did not even take that trouble.

API (Application Programming Interface)

What is behind the term API?

API is the abbreviation for “Application Programming Interface”. An API is therefore an interface: it connects software and hardware components, such as applications, hard disks or user interfaces.

In programming, APIs standardize the passing of data between program parts, such as modules, and between programs.

Complex programs no longer work without APIs. Individual program parts, so-called modules, are encapsulated away from the rest of the code and communicate exclusively via their APIs. Simply put, an API gives each component a well-defined, machine-readable contract for talking to the others.

Why do you need APIs?

Thanks to program modules and their associated APIs, complex programs are easier to maintain. Because all data crosses the API, a module can be tested in isolation by checking the data passed across that boundary, which makes it easier to locate and fix errors. In addition, APIs act as data providers in the software domain: content can be exchanged between various websites and programs using APIs.
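A minimal sketch of this modular idea (all names here are illustrative, not a real library): the module's internals stay hidden, callers see only two public functions, and the module can be tested purely through the data crossing its API.

```python
from typing import Optional

_storage = {}   # private by convention (leading underscore)

def store(key: str, value: str) -> None:
    """Public API: save a value under a key."""
    _storage[key] = value

def lookup(key: str, default: Optional[str] = None) -> Optional[str]:
    """Public API: fetch a value, or `default` if the key is absent."""
    return _storage.get(key, default)

# Testing via the API boundary, without touching the internals:
store("greeting", "hello")
assert lookup("greeting") == "hello"
assert lookup("missing", "n/a") == "n/a"
```

Because callers never touch `_storage` directly, the dict could later be replaced by a database without breaking any caller.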

The term API (short for “application programming interface”) accordingly provides programmers with an access point to a piece of software. The API is published by the owners of the software and allows external programmers limited access to it. The API thus provides a bridge between providers and users (programmers).

This is what the API offers its users

Unlike direct access to the source code of programs, APIs offer their owners and external programmers:

    • A manageable access point to parts of the company’s software system for programmers.
    • A way for publishers (owners) to hide certain parts of their software from programmers.
    • Communication between the programmer’s code and the software, where the API transmits the programmer’s inputs and returns the software’s response.
    • A network of countless programs that benefits immensely from so-called code reuse (reuse of programming code). This is how countless apps are created.
    • APIs are very important for web services, for example: through the shared interface, programmers can insert existing content into their own programs and process it further.
The term API comes from English and is the short form of “Application Programming Interface”. Freely translated, this means something like “interface for application programming”; colloquially, it is usually just called a programming interface.

This interface provides other programs with a tool to connect to the software system. It allows developers to influence the hardware, such as the monitor or the data on the hard disk, without having to address it directly. The operating system serves as the interface: it receives requests from the programs via the provided libraries and forwards them to the hardware.

APIs and digital content

The term “API” has become relevant mainly through its use in web services. These allow developers to use the provided interfaces to dynamically integrate supplied content into their own programs. APIs thus serve to exchange and further process data and content between different websites, programs and content providers. In addition, they allow third parties access to previously closed data pools and user groups.

In more technical terms, APIs are the machine equivalent of the user interface, which is optimized for people and thus “human readable”. The API, by contrast, is a software-facing, machine-readable interface. It provides clearly abstracted and structured access to the functions of the backend, and data can be exchanged in a particularly reduced, easy-to-process form.

There are four different types of Web APIs:
    • Internal APIs
    • External APIs
    • Platform APIs
    • Authentication and Authorization APIs

 

Example

Through the YouTube API, developers can search for videos by parameters such as name or length. The API returns the answer in the form of an XML file, which, after processing, can be used on your own website.
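The processing step can be sketched with the standard library (the XML below is a hypothetical response shape for illustration; the real YouTube API's schema differs):

```python
import xml.etree.ElementTree as ET

# Hypothetical XML response, invented for this example:
response = """
<videos>
  <video><title>Intro to APIs</title><length>212</length></video>
  <video><title>REST in 5 minutes</title><length>300</length></video>
</videos>
"""

root = ET.fromstring(response)
titles = [video.findtext("title") for video in root.findall("video")]
lengths = [int(video.findtext("length")) for video in root.findall("video")]
```

Once parsed into plain Python lists like these, the content can be rendered on your own pages.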

 

 

Top 5 frameworks for developing a E-commerce application in 2018

From small startups to big brands, almost everyone is selling products online through an e-commerce platform, and the reasons are obvious: instant returns, greater reach, home delivery and digital management all make a business more efficient, and the process of selling products becomes seamless and easy to automate.


More and more people nowadays prefer shopping online, especially the youth and the new generation. According to one estimate, the e-commerce market will cross 4.88 trillion US dollars in revenue by 2020, up from about 2.3 trillion US dollars in 2017.

A few glitches aside, the major economies are pulling out of the slow growth and recession of 2008, so there appears to be no factor that could hold back the growth of e-commerce in the near future.

If you are thinking of launching your own e-commerce platform, now is the right time to get on board.

Here is a list of reasons why you should choose an e-commerce platform over traditional ways:
  • It is more convenient – users can buy products at any time, unlike the traditional way, which is time-bound.
  • Broadened reach – you can extend your reach to other countries, which would consume far more resources if done the traditional way.
  • Spend less on marketing – marketing digitally online is much cheaper than traditional marketing.

Here is the list of top e-commerce platforms used to develop e-commerce websites:

  • 1. Magento Community Edition
  • 2. WordPress WooCommerce
  • 3. OpenCart
  • 4. osCommerce
  • 5. Shopify

1. Magento Community Edition

Magento was released in 2008. It is an open-source platform, which made it one of the most popular frameworks for e-commerce. Many big companies use Magento for their online e-commerce platforms, such as Nike, Burger King and Huawei.
Pros:-
It is an open-source framework, so community support is available whenever you run into trouble during development. The platform offers high usability and low maintenance, more than 9,000 add-ons to scale up your application, and a large community with plenty of resources.
Cons:-
The basic version is free, but to access all the features you have to opt for the premium version, which is costly. Users need good programming knowledge to work with Magento.

2. WordPress WooCommerce

WordPress is one of the most popular content management systems on the web, and WooCommerce extends its power even further: WooCommerce is a WordPress plugin made specifically for e-commerce operations.

Pros:-
It is pretty easy to use and can incorporate unlimited products and categories. The framework has options to add a shopping cart and provides users with secure gateways for online payment.
Cons:-
You need additional money to integrate WooCommerce with a shopping cart completely. The add-ons are premium, meaning you have to buy them to use them. It is not suitable for highly scalable platforms.

3. OpenCart

OpenCart is an open-source project that aims to provide an easy alternative to other solutions.
Pros:-
It is an open-source framework, so there is no need to buy a license to install the OpenCart software. All add-ons are free to use without restrictions. The framework is pretty easy to use and consumes very few resources – it does not devour resources the way Magento does.
Cons:-
It offers fewer features compared to Magento. Sometimes the user will have to install additional plugins to boost performance. There are not many customization options available for the framework.

4. osCommerce

osCommerce is older than most of the others mentioned in this list, which gives it an edge over its near competitors.

Pros:-
More than 30,000 online stores have been built with osCommerce, which speaks to its credibility. It is much like WordPress: it has strong community support and comes with a huge collection of educational resources, so you won't get stuck anywhere.

Cons:-
It is a premium product, meaning it is not free; if you want to install it, you have to pay for it. It is also not suited to larger e-commerce websites, because the framework does not handle scaling well.

5. Shopify

Shopify is one of the best-known platforms on this list, with 600,000 merchants already using it in 2017. It is a premium platform with a lot of features.

Pros:-

There are more than 1,000 professional themes available to choose from when you open your store. Shopify takes the pain out of testing a platform, because everything, from the themes to every plugin, is tailored specifically for Shopify.

Cons:-

It is a premium platform, so you need to pay to use it. It charges a transaction fee on every sale unless you use Shopify Payments.

Django (Python) vs Laravel (PHP)

Here we will try to conclude which is better among Django and Laravel (Python vs PHP).

Django and Laravel were both built around a single concept, taking an idea and launching it as fast as possible, and each is good in its own way. But both also fall short in some areas. Laravel lacks developer support and has a steeper learning curve compared to Django, while Django is not perfect either: it falls short in the API department. You cannot build APIs in Django alone; you have to use the Django REST Framework for that job, and juggling two frameworks is impractical.

One of the customers' biggest concerns is security, and here Django takes the lead: it helps developers avoid the common mistakes that usually cause trouble in web-app development, for example SQL injection, cross-site scripting and cross-site request forgery.
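Django's ORM guards against SQL injection by parameterizing queries. The underlying idea can be sketched framework-free with Python's `sqlite3` module (the table and values are invented for this example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.executemany("INSERT INTO users VALUES (?)", [("alice",), ("bob",)])

attacker_input = "nobody' OR '1'='1"

# Vulnerable: string concatenation lets the input rewrite the query,
# so this "lookup of a non-existent user" matches every row.
unsafe = "SELECT name FROM users WHERE name = '" + attacker_input + "'"
assert len(conn.execute(unsafe).fetchall()) == 2

# Safe: a parameterized query treats the whole input as plain data.
safe = conn.execute("SELECT name FROM users WHERE name = ?", (attacker_input,))
assert safe.fetchall() == []
```

Frameworks that parameterize by default take this class of mistake off the developer's plate.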

Laravel also has some of these features but falls behind when compared to Django. Laravel gets a one-up in the tutorial department, because it has a library of 1,100 videos along with its documentation. Django leads in the speed department because it uses Python, one of the fastest languages in the industry for development; although Python's execution time is comparatively high, many of its built-in functions are implemented in other languages like C and C++.

Laravel and Django are both open source and have extensive libraries to speed up development. Laravel has libraries that make networking much easier compared to Django. Django, for its part, provides an admin GUI that automatically builds a site area where you can create, view, modify or delete records; nothing comparable is available in Laravel.

Django also provides some built-in decorators, like login_required, require_POST or permission_required, which make a developer's life a little easier. Django provides base view classes for a wide range of applications, a caching system for saving dynamic pages, and an extensible authentication system. Since the release of Laravel 3, adding functionality to an application has become easier thanks to the framework's bundle packaging system and dependency handling. Laravel can make web-application development easier, especially around routing.
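The decorator idea can be sketched without the framework (this is a simplified, framework-free illustration of what a decorator like Django's login_required does; Django's real implementation differs, and the dict-based "request" is a stand-in):

```python
from functools import wraps

def login_required(view):
    """Wrap a view and reject unauthenticated requests up front."""
    @wraps(view)
    def wrapper(request, *args, **kwargs):
        if not request.get("user"):               # stand-in for request.user
            return {"status": 302, "location": "/login/"}
        return view(request, *args, **kwargs)
    return wrapper

@login_required
def dashboard(request):
    return {"status": 200, "body": "Hello " + request["user"]}

assert dashboard({})["status"] == 302             # anonymous: redirected
assert dashboard({"user": "alice"})["status"] == 200
```

The view body never has to repeat the authentication check, which is exactly the convenience the built-in decorators provide.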

You can build a project of any magnitude in Laravel and test it easily through unit testing, and the issues of scalability, security and responsiveness have begun to be resolved since the release of Laravel 5. Django's sheer scale and functionality, on the other hand, can feel clunky and bloated for small applications: it has too many bells and whistles that get in the way when developing a small-scale application.

Conclusion:

The debate of Django vs Laravel is really a debate of Python vs PHP. If you need mature software with extensive library support and easy development, have security as your number-one priority, and need to scale, then Django is for you; but for small-scale development, routing-centric applications, or web apps you need to start from scratch with experienced PHP coders, Laravel is perfect for you.

Internet of Things (IOT)

With the dawn of the 21st century, a new idea sparked in the tech world: connect everyday appliances to the internet, make them communicate and share data with each other, and use that data to make our lives easier. As good as it sounds in theory, there are many challenges in making it work, the two biggest being the technology gap between different appliances and security. We are now reaching the point where most of our devices are getting connected to the internet, for example smart TVs, fridges and cars, so the problem of the technology gap is close to being solved. But the second, more important issue, security, remains a major one: in networking, any connection is vulnerable to a variety of attacks, chief among them man-in-the-middle (MITM), and the wireless communication used most in the Internet of Things (IoT) is especially vulnerable to snooping. The sheer number of devices in an individual IoT network does not help security either, because different devices use different technologies and protocols. This also creates compatibility issues, though that is a minor problem compared to the others. Solving all of the above gives us a technology that can push the automation industry forward.

Imagine walking out of the house without even thinking about turning off the gas or the lights. Imagine getting alerts from the fridge about items that are low in stock whenever you are in the supermarket, or having your car brought out of the parking lot to the front of the house whenever you are leaving for the office. All of this is just a small part of what implementing the Internet of Things (IoT) could do. This technology can revolutionize the way we accomplish our everyday tasks; it can save a tremendous amount of time, improve the way we live, and affect every part of our lives.

That is all in the (near) future, but in the present we are already implementing some aspects of IoT: wearable tech and smart homes, for instance, are becoming major hits. We track our heart rate, step count and blood pressure using smart wearables and use the collected information to improve our health. Smart homes are automating all the menial repetitive tasks like flushing, turning off appliances, ordering everyday items and much more. This technology will get better with use if combined with machine learning and artificial intelligence to predict our actions and adapt the environment to those actions and changes.

IoT systems are combinations of hardware and software. Most of the hardware consists of sensors, and the software platforms mainly used for developing and coding IoT devices are ARM Mbed and the Bosch IoT Suite. ARM Mbed is a tool that gives you a platform to write software for the hardware and to connect that device to the cloud, through which IoT can be implemented. ARM also provides built-in security features, such as devices protecting themselves from untrusted and malicious code. The Bosch IoT Suite offers the same functions as ARM with some extras, such as the use of open standards and open-source code, and it is mainly used to develop cross-platform systems. Evothings Studio and Netvibes are other major platforms used to create IoT systems.

But all this accessibility comes at a price: the danger of a breach of privacy. The deeper we get into the web of the Internet of Things, the more information about ourselves we put on the network, so if a privacy breach occurs it will be far more drastic than your run-of-the-mill breach, because of the sheer amount of our information that will be exposed: our everyday patterns, our routines, our financial credentials, and so on.

So the question arises whether this is a game-changing technology or simple hype. In my opinion, if this technology matures and is applied correctly, it will change the way we live; but if it is left with security loopholes, it can turn into the biggest information-collecting machine for the bad guys.