LATEST FROM OUR BLOG

Take your daily dose of (only) relevant news, useful tips and tricks, and valuable how-tos on using the latest web technologies shaping the digital landscape. We do the necessary information sifting for you, so you don't have to, and provide you with content that will help you anticipate the emerging trends about to influence the web.

How to Integrate Alexa with Your Drupal 8 Website: A Step-by-Step Guide
Just imagine: a user asks Amazon Alexa to read out loud the headline of your latest blog post! Or to look for a specific section on your Drupal site! Or, even better: quit imagining this and start implementing it instead, right on your website. Here's how you integrate Alexa with your Drupal 8 website via the Alexa integration APIs. A 7-step tutorial:

- on how to get Alexa to “talk to” your site users/online customers
- on turning your site's content into the needed “raw material” for setting up your custom Alexa skills
- on how you can leverage Drupal 8's outstanding third-party integration capabilities to “fuel” your implementation plan

So, here's how it's done:

But Why Precisely Amazon Alexa over Other Voice Assistants?
Because Amazon Alexa stands out with its top-notch integration capabilities. Its integration APIs make it possible for this particular voice service to be “plugged into” various devices and web services. As simple as that! Alexa is more than just a voice assistant making voice (obviously!) interaction possible: it's a voice service that empowers you to integrate it even with your Drupal 8 website, quickly and smoothly, via its own built-in APIs!

Introducing Alexa: The Drupal Module for Amazon Alexa Integration
With Amazon “doing its own part” by bringing the Alexa integration APIs into the equation, it was only fair that the Drupal community should play its own part as well. The result of its sustained efforts? The Alexa Drupal module, which provides an endpoint for your Drupal 8 website, where it receives the vocal user requests “stored” in your Alexa skills. These user requests get converted into text strings before getting sent over to the Alexa module on your Drupal site.

Note: do keep in mind that the Alexa module is still under development, but with a more than promising long-term future ahead of it. For now, it offers basic integration with Amazon's Alexa. And this is precisely why you'll need to build a custom module as well, to integrate Alexa with your Drupal 8 website. But more details on this in the tutorial below:

Integrate Alexa With Your Drupal 8 Website: A 7-Step Guide

Step 1: Make Sure Your Site Uses HTTPS
In other words: make sure your Drupal 8 website is “easily detectable” by Amazon's servers! The very first step is to switch your site over to HTTPS (a step you can skip if your site's already on HTTPS).

Step 2: Install the Alexa Module
Go “grab” the Alexa Drupal module and get it installed and enabled on your website.

Step 3: Set Up Your Alexa Skill
With your dedicated Drupal module on, it's time to focus on all the setting up to be done on the Amazon Developer site. And the very first step there is to create your own new Alexa skill in the Skills Kit.

Step 4: Copy & Paste Your Application ID
And this is no more than a quick 2-step process:

- first, copy the Application ID provided in the “Skill Information” section on the Amazon Developer site
- then submit it to your website's configuration at /admin/config/services/alexa

Step 5: Configure Your New Alexa Skill
A key 3-part step to take when you integrate Alexa with your Drupal 8 website, where you:

- give a name to the Alexa skill (in the Alexa app) to be triggered
- set up an Invocation Name for your users to utter for “activating” your newly created Alexa skill
- set up the custom vocal commands, or “intents”, that Alexa will need to respond to

For this, you'll need to go to the Amazon Developer site again and access the “Skill Information” section.

Note: maximize the odds that your users utter precisely those intents by adding more phrasings of the very same question/vocal command.

Another note: this flexibility proves that you get to harness the power of... variables when setting up your custom intents. “Variables” that you'll use with the custom module that you're going to build at the next step of the process.

Step 6: Create a Custom Module for Triggering the Right Responses to Your Intents
What should happen when your custom intents get invoked and sent through to your Drupal 8 website? You'll need to create a custom Drupal 8 module that handles the responses. For this, insert the following info in the demo_alexa.info.yml file:

name: Alexa Latest Articles Demo
type: module
description: Demonstrates an integration to Amazon Echo.
core: 8.x
package: Alexa
dependencies:
  - alexa

Note: do keep in mind to add the Alexa Drupal module as a dependency!

Now, time to build the custom module itself:

- create a file at src/EventSubscriber/
- name it RequestSubscriber.php

As for the code that will “populate” your module, first of all it's the namespace and use statements that you'll need to create:

namespace Drupal\demo_alexa\EventSubscriber;

use Drupal\alexa\AlexaEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Drupal\paragraphs\Entity\Paragraph;

Then, you'll need to set up your main class, as well as a function to trigger the event:

/**
 * An event subscriber for Alexa request events.
 */
class RequestSubscriber implements EventSubscriberInterface {

  /**
   * Gets the subscribed events.
   */
  public static function getSubscribedEvents() {
    $events['alexaevent.request'][] = ['onRequest', 0];
    return $events;
  }

Next, set up the function “responsible” for giving responses to each one of your custom intents, as sketched below.
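Here is an illustrative sketch of such a handler. The request/response method names (getIntentName(), setOutputSpeech()) and the LatestArticleIntent intent are assumptions for demonstration purposes, not verbatim Alexa module API; check the module's own documentation for the exact calls:

  /**
   * Responds to a dispatched Alexa request event.
   *
   * Sketch only: adjust the request/response method names to match the
   * actual API of the Alexa module version you install.
   */
  public function onRequest(AlexaEvent $event) {
    $request = $event->getRequest();
    $response = $event->getResponse();

    // "LatestArticleIntent" is a hypothetical intent defined in Step 5.
    switch ($request->getIntentName()) {
      case 'LatestArticleIntent':
        // Look up the newest published article and read its title out loud.
        $nids = \Drupal::entityQuery('node')
          ->condition('type', 'article')
          ->condition('status', 1)
          ->sort('created', 'DESC')
          ->range(0, 1)
          ->execute();
        $node = \Drupal\node\Entity\Node::load(reset($nids));
        $response->setOutputSpeech('The latest article is ' . $node->getTitle());
        break;

      default:
        $response->setOutputSpeech('Sorry, I did not understand that.');
    }
  }

}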
With the code for your responses at hand, the very last file that you'll need to focus on is demo_alexa.services.yml:

services:
  alexa_demo.request_subscriber:
    class: Drupal\demo_alexa\EventSubscriber\RequestSubscriber
    tags:
      - { name: event_subscriber }

Note: remember to enable your demo Alexa module, then to navigate to the Amazon Developer site once again!

Step 7: Test Out Your New Alexa Skill
Another essential step to take when you integrate Alexa with your Drupal 8 website is testing your newly created Alexa skill. And there's even a Test tab on https://developer.amazon.com for that! Click on this specific tab, ensure that your new Alexa skill is enabled and thus ready to be tested and... see whether you get the right responses!

The END! This is the “how it's made” for getting Amazon Alexa to “talk to” your Drupal 8 website via:

- the Alexa integration APIs
- the Alexa module
- a custom-built Drupal 8 module

... Read more
RADU SIMILEANU / May 18'2018
When Should You Not Consider Using Node.js? 3 Unsuitable Use Cases
It's undebatable: Node.js has practically laid the foundation of the real-time web! Real-time, two-way connection web apps have revolutionized the old web response paradigm, the one where it was just the client who could initiate communication, never the server. Even so, there are certain cases when using Node.js is not the best decision you could make. Specific use cases for which the otherwise flexible and revolutionary web server technology turns out to be... unsuitable. So: “When shouldn't I use Node.js?”, you might legitimately ask yourself. Here are the 3 bad use cases for this JavaScript runtime environment. Scan them through, take note of all the factors that I'll be outlining, and think them through before rushing to power your next project with Node.js.

1. A CPU-Heavy Application: Using Node.js Is Simply a Bad Idea
Face it, deal with it and... adjust your decisions to it: there are plenty of better solutions (other than Node.js) for powering your CPU-intensive app. It's just not the best technology at hand when it comes to heavy computation. Now here's why, by using Node.js, you'll only end up “sabotaging” its very advantages, instead of turning it into a true “horsepower” for your app, as you might expect:

- Node.js leverages an event-based, non-blocking I/O model, using a single CPU
- hence, all that intense CPU-processing activity will actually block the incoming requests
- … since the thread will get “held up” with number-crunching

The direct effect of “exploiting” Node.js in the context of heavy server-side computation? The very benefits of its event-driven, non-blocking I/O model get practically... nullified in the context of CPU-intensive operations. Given this, why would you stubbornly stick to Node.js when there are other technologies more suitable for building your CPU-heavy software with? With better results? To see the problem in action, consider the sketch below.
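A minimal sketch (the route paths and the Fibonacci workload are purely illustrative): while the synchronous number-crunching runs, Node's single thread cannot answer any other request.

const http = require('http');

// Deliberately heavy, synchronous computation: while this runs,
// the event loop is blocked and every other client has to wait.
function fib(n) {
  return n < 2 ? n : fib(n - 1) + fib(n - 2);
}

http.createServer((req, res) => {
  if (req.url === '/compute') {
    // Seconds of pure CPU work, all of it on the one and only thread.
    const result = fib(42);
    res.end(`fib(42) = ${result}\n`);
  } else {
    // This "fast" route is also stuck behind any in-flight /compute call.
    res.end('hello\n');
  }
}).listen(3000);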
2. A Simple CRUD (or HTML) Application
No need to get your hopes high when using Node.js with a basic CRUD or HTML application: it might turn out to be just “slightly” more scalable, yet don't expect a traffic flood just because it's Node.js-powered. In short: use cases like this one, where data's provided, straightforwardly, by the server and where there's no need for a separate API, render Node.js superfluous. There are other frameworks suited specifically for this type of project (take Ruby on Rails, for instance). Using Node.js in this case would be like driving a Formula 1 car while... stuck in rush hour traffic.

3. A Relational Database-Backed Server-Side App
Why isn't Node.js your best choice for a relational data access type of project? Because its relational database tools aren't as reliable, robust and easy to work with as other frameworks' toolboxes (take Rails, for instance!). Rails, for example, would “spoil” you with:

- already matured Data Mapper and Active Record data access layer implementations
- out-of-the-box data access setup
- DB schema migration support tools
- … and the list goes on

In short: if there already are frameworks perfectly “equipped” for this type of project “scenario”, why would you stick to using Node.js, since its relational DB toolbox is not (yet) as advanced?

In Other Words...
With these 3 “bad” use cases for Node.js “exposed”, allow me to put together a short “inventory” here, one including all the “gotchas”, aspects to consider before kicking off your Node.js project, and limitations to be aware of:

- Node.js hasn't been built with the compute-scaling issue in mind...
- … but it has been created to solve the I/O-scaling issue instead
- except for contexts of CPU-heavy operations, Node.js still is the best technology at hand for powering your real-time, scalable web apps
- do reconsider your decision to use Node.js if your piece of software depends on some kind of threading model
- there are also poor-quality packages available in npm, free to use in your Node.js application; do keep this in mind as you dig deep into the “load” of Node.js packages
- Node.js will never be “the best choice” for event loop-blocking use cases (take parsing a large XML document, for instance)
- … nor for powering apps relying on intense computation
- Node.js's “worker” model is geared at solving HTTP server-calling issues (rather than intense computing issues)

The END!
... Read more
RADU SIMILEANU / May 17'2018
What Is Node.js Used for? What Projects Can You Build Using It? 7 Best Use Cases
Not exactly the “jumping on the latest trend” type? Therefore, you're still a bit hesitant to get on the Node.js bandwagon? And this because you still haven't got crystal-clear answers to your “What is Node.js used for?” question? You're legitimately hesitant then! Everyone's gone crazy over it these days, but you know that there must be certain factors to consider. Specific use cases for Node.js, as well as cases when... well... it just isn't the best idea. You're well aware that there are some particular applications that call precisely for this JavaScript runtime environment. And you need to be 101% sure that your project fits the “profile”: one of the “best use cases of Node.js”. But let's not meander any longer and get you some clear answers to your questions instead:

- Why should you even consider Node.js one of the possible technologies to choose from for your next project?
- Which are the ideal use cases for Node.js?

1. Why Would You (Even) Consider Node.js for Your Project?
Why should Node.js be on your list of... options, in the first place? On your shortlist of technologies that might power your next project? There must be some “bundles of convenience”, some major benefits that you can “reap” from using it, right? Benefits that make it worth building your project on this specific environment. Well, let us shed some light on these clear advantages of developing your application in Node.js:

- it's built on a Google JavaScript engine (V8, wrapped in a runtime environment), which, translated into clear benefits, means that the web apps you build with it are fast and scalable
- speaking of its scalability: Node.js is built to scale on an individual-process basis and to leverage multi-core processing on modern servers (via its Cluster module)
- it's JavaScript-based... so the “pool” of developers with some kind of JS knowledge/expertise is conveniently large: 99.9% of them know at least “some” JavaScript
- … not to say that this turns Node.js into the perfect choice if there are beginner developers on your team (even junior developers are at least familiar with JS)
- any developer will be able to gain a quick understanding of your Node.js app's codebase
- it speeds up developers' work with a collection of modules and tools (Grunt, npm, etc.)
- it provides your development team with a great package manager, npm, with a widely available and increasingly heavy “load” of open-source tools
- it's backed and actively maintained by an ever-growing community ready to... support you; the knowledge base that your development team needs to get a grip on Node.js is accessible and... free
- it's open source: you'll benefit from a single, free codebase
- it's just... fast, there's no point in debating over this: the event loop and Google's innovative technologies are “turbocharging” it
- it cuts down costs, as simple as that: Node.js enables your team to use the same language on the front-end and on the back-end, which translates into boosted efficiency, cross-functionality and, implicitly... reduced costs
- you're “tempted” with a whole range of hosting options to select from
- it supports native JSON: it's in this specific format that you'll get to keep your data stored in your database

Now, if I were to trim this list down to just 3 answers to your “What is Node.js used for?” dilemma, it's these 3 key benefits that I'd stubbornly stick to:

- performance: Node.js is simply... fast; moreover, as a runtime it has enhanced JavaScript with new capabilities
- versatility: from back-end to front-end apps and pretty much everything in between, Node.js enables you to build any kind of project that you have in mind, as long as it's written in JavaScript, of course
- agility: regardless of your/your team's level of JavaScript expertise, Node.js empowers you to kick-start your project and to get it up and running in no time; it's developer productivity-oriented (just think: the same language for both back-end and front-end!), with a low learning curve
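Before moving on to the concrete use cases, here's a minimal sketch of the kind of code Node.js makes trivial: a tiny TCP chat server that handles many concurrent clients on a single thread. The port and messages are, of course, just illustrative:

const net = require('net');

const clients = new Set();

// Every connection is handled by callbacks on the same single thread:
// no thread-per-client, no blocking, just events firing on sockets.
const server = net.createServer((socket) => {
  clients.add(socket);

  socket.on('data', (chunk) => {
    // Broadcast each incoming message to every other connected client.
    for (const other of clients) {
      if (other !== socket) other.write(chunk);
    }
  });

  socket.on('close', () => clients.delete(socket));
  socket.on('error', () => clients.delete(socket));
});

server.listen(4000, () => console.log('chat server listening on :4000'));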
2. What Is Node.js Used for? 7 Great Use Cases
Now back to the very question that started this post: “What exactly is Node.js good for/used for?” There are specific app projects for which this server-side JavaScript technology makes the best choice:

2.1. Chat servers
And, generally speaking, any type of fast-upload system challenged to respond, in real time, to an “avalanche” of incoming requests.

2.2. Real-time apps
This is the use case that Node.js “rocks at”, where you get to make the most of its capabilities. Apps geared at processing high volumes of short messages, where low latency becomes critical, make the best possible answer to your “What is Node.js used for?” question. Here's why:

- it enables sharing and reusing Node.js packages that store library code
- it processes ideally fast: quick data sync between the client and server
- it's perfectly “equipped” to cope with multiple client requests

In short: if scalability and real-time data processing are 2 critical factors to consider when choosing the best technology for your project, think Node.js! It's built to suit specifically those situations that are “overly demanding” of our servers.

2.3. Video conference apps
... applications using VoIP or specific hardware. Projects involving intense data streaming — audio and video files — make the best use cases for Node.js.

2.4. Instant-messaging, live-chat apps

2.5. Highly scalable apps
Think Trello- or Uber-like apps that depend on a server-side technology enabling them to scale out on multi-CPU servers. Node.js, thanks to its cluster-based architecture, will always make the best choice for apps depending on a technology that spreads the load across a multi-core server. Note: speaking of scalability requirements, should I also mention that Node.js is... conveniently lightweight, too?

2.6. eCommerce transaction software and online gaming apps
“What is Node.js used for?” For powering apps for which online data is of critical importance. Like these 2 examples here!

2.7. Server-side applications
Node.js follows an event-driven programming model: the flow is determined by messages, user actions and other specific events of this kind.

3. Afterword
Does this clear the picture for you a bit more? As a conclusion, or “final” answer to your “What is Node.js used for?” type of dilemma, the key takeaway from this post is that Node.js is used primarily for web applications, but it's starting to get used more and more often for developing enterprise apps, too, thanks to its versatility. What does the future have in store for this (still) increasingly popular technology?

- rising potential for Node.js to be used for building IoT solutions
- … and for “experimenting” with enterprise data
- more and more big names (adding to Netflix, IBM, Amazon, Uber, LinkedIn, etc.) choosing it over legacy languages such as Java or PHP

... Read more
RADU SIMILEANU / May 10'2018
Reservoir, or Decoupling Drupal Made Easy for Anyone: Non-Drupal Developers and Editors
Here's what the ideal decoupled Drupal scenario looks like: stripping Drupal down to its essential role, that of a robust and flexible content repository, no Drupal expertise needed. Then using it to back your front-end; one that you'd be free to build by leveraging any modern (JavaScript) technology of your choice. … a Drupal back-end content store that would still preserve all its content editing and managing functionalities, needless to add. Luckily, this is no longer “daydreaming”. Not since Reservoir, the headless Drupal distribution, has been available. Here are some of its “promises”, or well-known challenges if you prefer, that this distribution's geared at solving:

- to make Drupal far more accessible (cutting the intimidating Drupal setup and configuration out of the equation) to developers of all stripes
- to empower developers with all the best practices for building their Drupal-backed front-ends quickly and easily
- to provide an opinionated starting point enabling any developer to build a Drupal content repository backing their non-Drupal application... with no Drupal knowledge needed, actually

Your Current Situation: Why Would You (Even) Consider “Headless” Drupal?
Here you are now, dealing with the pressure of:

- having to deliver content agnostically across any given channel and device: single-page JS apps, mobile apps, digital signage, AR and VR-driven content, IoT apps etc.
- … all while storing it (the content) in one single place
- providing your editorial team with a way to edit, manage and, overall, administer content conveniently easily, via an editor-friendly UI
- … independently of the development team, of course
- finding a way to enable your developers to easily send content across this entire “ecosystem” of channels, devices and platforms

In other words: you're grappling with the challenge of making Drupal ideally accessible to your (non-Drupal) developers, so they can easily build a Drupal-based content store enabling them to deliver content to any given device... to serve it to any given app/site. And this definitely calls for a decoupled Drupal approach.

Decoupling Drupal: The Most Discouraging Challenges You Must Be Facing
Let's assume that you're already considering headless Drupal as a solution for your current challenge: that of delivering content to multiple channels, devices and platforms. Whether you're planning to decouple Drupal for:

- building a Drupal-backed front-end, leveraging the modern JavaScript framework of your choice
- or using it as a content store for your non-Drupal app

… it's these specific challenges that you must be facing right now:

- your non-Drupal developers are having trouble maneuvering Drupal content; they're not familiar with all the dials and knobs needed for making the most of Drupal's REST API
- Drupal's serialization format is... alien to them
- there's no starting point or well-defined set of best practices for non-Drupalists that would ease their way to turning Drupal into a content repository... one that they could back their front-ends with

True story! And still, there is hope...

5 Reasons For Being “Skeptical” About Distributions
You must be legitimately cautious right now when it comes to using an API-first distribution for Drupal. And that's due to some bad experiences with... distributions.
Now let me try and guess some of your “fears” regarding Reservoir:

- that it might turn out to be overly complex
- that you risk getting “stuck with” architectural debt
- that its maintainers might someday lose interest in it
- that it's built primarily for other use cases, for scenarios different from your own decoupled Drupal implementation project
- that you risk “inheriting” bugs in features that you haven't even used

And the list of reasons why you're not yet jumping on this decoupled Drupal trend could go on...

Introducing Reservoir: The Headless Drupal 8 Distribution! How Is It Different?
Before putting it into the spotlight and giving it a “full scan”, let me try to read your mind and identify the questions that you must be asking yourself right now:

- “How precisely do I use Reservoir as a content store backing my front-end website or app?”
- “Which are the bare-essential Drupal modules and core functionality that this distribution comes packed with?”
- “How can I leverage these ready-to-use components for decoupling Drupal?”

And now that we've put your valid queries into words, let me try and define Reservoir for you:

- 1st definition: a distribution for decoupling Drupal
- 2nd definition: an ideally flexible and minimalist tool empowering developers of all backgrounds to build content repositories for their apps to “consume”
- 3rd definition: the headless Drupal 8 distribution “specialized” in manipulating content and interacting with it via HTTP APIs
- 4th definition: a Drupal-based content store with all the web service APIs baked in, so that any developer can jump straight to building their front-end app
- 5th definition: simply a... content repository; one that just happens to be Drupal-based, as the Reservoir project's maintainers admitted

Now, the 4 key goals behind this distribution for decoupling Drupal — besides that of providing a simple way of building a content repository enabling you to use any technology for your front-end — are:

- on-boarding developers of all stripes, making Drupal ideally accessible to... anyone
- providing a much-needed opinionated starting point for any type of decoupled Drupal implementation; no Drupal knowledge required
- keeping itself away from the scope creep that end-user-facing products risk falling into
- serving a specific decoupled use case

Decoupling Drupal Made Easy & Accessible: Key Reservoir Features
“But how does Reservoir make building Drupal-based content repositories so much easier than other... distributions?” “How precisely does it make Drupal accessible to non-Drupal developers, as well?” You're more than entitled to ask yourself that... Therefore, let me outline here the out-of-the-box Reservoir features geared at speeding up any decoupled Drupal implementation, regardless of the developer's background:

- an opinionated selection of API-first/web services modules — Reservoir offers each developer a much-needed starting point/“push”, so that they can ramp up and have their content store built in no time (the Simple OAuth module here included)
- quick and easy access to the content back-end via JSON API
- auto-generated API documentation, which gets automatically updated as you browse it and as your content model changes
- OpenAPI format export, which supports hundreds of tools integrating with the OpenAPI specification
- easy onboarding/tailored UI — expect a “welcoming tour” once you've installed Reservoir, one focused on getting you familiar with modeling and managing content, web service APIs, mapping out new content models etc.
- a permission system and content editing UI empowering your editorial team to easily manage content
- SDKs, libraries and references — included in the Waterwheel ecosystem — so that your development team can skip the time-consuming API learning phase and jump straight to “attaching” Drupal back-end content to their front-end apps

Note: Reservoir, the distribution for decoupling Drupal, deliberately shakes off some of the Drupal functionality that's irrelevant for content repositories (modules such as Breakpoint, Views, Content, the user-facing front-end etc.). For we couldn't even talk about speeding up your decoupled Drupal project when there's an unnecessarily heavy weight of Drupal modules and features “dragging down” the whole implementation process, right?
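For a taste of what “consuming” such a content repository looks like from the front-end side, here's a minimal sketch. The host and the article collection path follow the JSON API module's conventions, but treat both as illustrative assumptions for your own setup:

// Fetch the 5 newest articles from a (hypothetical) Reservoir-backed site.
// JSON API exposes collections at /jsonapi/{entity_type}/{bundle}.
const BASE = 'https://cms.example.com';

async function latestArticles() {
  const res = await fetch(
    `${BASE}/jsonapi/node/article?sort=-created&page[limit]=5`,
    { headers: { Accept: 'application/vnd.api+json' } }
  );
  if (!res.ok) throw new Error(`JSON API request failed: ${res.status}`);
  const { data } = await res.json();
  // Each resource object carries its fields under `attributes`.
  return data.map((node) => node.attributes.title);
}

latestArticles().then((titles) => console.log(titles));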
Wrapping Up: What Reservoir Aims At Is...
... enabling your developers to jumpstart building self-hosted content repositories capable of serving any given front-end. Front-ends that they get to build independently, tapping into the technologies they prefer, on a project-by-project basis. Pretty convenient, don't you agree?
... Read more
Adrian Ababei / May 09'2018
How to Scale Your Node.js App: Best Strategies and Built-In Tools for Scalability 
Whether it's the increasingly challenging workload, or you simply want to enhance your Node.js app's tolerance to failure and availability, there comes a time when you just need to scale it up, right? To “squeeze” the best performance out of your entire infrastructure of... nodes. Well then, here's how to scale your Node.js app. Scaling up your web back-end app at different levels — overall improving its throughput — sure isn't an afterthought with Node.js: scalability is built into the very core of the runtime. And the infrastructure of nodes, strategically distributed, communicating with each other, is what makes this technology particularly scalable. So, what is the best way to scale up your Node.js app? Which are the most powerful built-in tools for scalability to explore and to “exploit”? And what are the best strategies to go for, depending on your specific scenario and scalable architecture needs?

Horizontally Scaling Your Node.js App
Horizontal scaling comes down to... duplicating: basically, you duplicate your application instance, enabling it to “cope with” a larger number of incoming connections. Note: you can horizontally scale your Node.js app either across different machines or on a single multi-core machine.

A word of caution: do keep in mind, though, that this scaling solution might add unnecessary complexity to your app's infrastructure; it might entail the need to provision and maintain a load balancer, might make troubleshooting more challenging, and might even change the way you deploy your app. That being said: make sure that it's specifically this Node.js scaling solution that your project needs before you go ahead and implement it!

Vertical Scaling
If your scalability architecture needs involve nothing more than:

- injecting more power
- adding more memory

… with no particular “tweaking” applied to the code, then vertical scaling might just be the right answer to the “how to scale your Node.js app” dilemma. Here's why:

- by default, Node won't use more than about 1.76 GB of memory on a 64-bit machine
- in the case of a 32 GB RAM machine, for instance, the Node process will limit itself to only a fraction of its memory

Have Multiple Processes Running on the Same Machine
Here's another possible answer to your “how to scale your Node.js app” question: have multiple processes running on the same port. It goes without saying that this scaling solution calls for some kind of internal load balancing that would distribute the incoming connections across the entire ecosystem of cores/processes. Word of caution! Not sure whether there's any need to add this: keep the number of running processes lower than the number of cores!

Hereinafter, let's focus on 2 popular tools for Node.js scalability that you might want to tap into:

The Cluster Module
Node's built-in cluster module makes a great starter for scaling up your application on a single machine. How does it work, precisely? It makes setting up child processes that share server ports conveniently easy. Practically, one “master” process is in charge of spawning all the child processes (one “worker” per core), and it's those workers that actually run your Node.js app. Feel free to dig into more details on the whole process.
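A minimal sketch of the pattern (port and response body are illustrative):

const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isMaster) {
  // The "master" only forks workers: one per CPU core.
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
} else {
  // Each worker runs its own copy of the server; they all share port 3000,
  // and incoming connections get distributed across them.
  http.createServer((req, res) => {
    res.end(`handled by worker ${process.pid}\n`);
  }).listen(3000);
}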
Yet, there are certain limitations to this basic scaling solution:

- in case one of your child processes “dies”, it doesn't... regenerate itself
- you'll need to handle the master-worker process difference the “old school way”, using an “if-else” block
- there's no way of modifying multiple processes at once, on the fly!

Note: yet, when it comes to the “dead child processes” drawback, there's... hope. For instance, use this piece of code, which enables the master process to... respawn a “worker”:

cluster.on('exit', (worker, code, signal) => {
  cluster.fork();
});

And voila! This drawback has been taken off your list!

The PM2 Cluster Module
Using the PM2 cluster module, the “how to scale your Node.js app” dilemma turns into: “Lay back and let PM2... clusterfy your server for you!” All you need to do is “trigger” this command's superpower:

pm2 start app.js -i 4 --name="api"

It will instantly create a 4-process cluster for you! Now, here are some more details about what's going on “under the hood” during this process:

- the PM2 daemon takes over the former “master process'” role and spawns N processes (the former “worker processes”), relying on round-robin balancing
- moreover, if it's the PM2 process manager that you're using, your process gets automatically scaled across all the existing cores (no need to trigger the cluster module for that anymore)
- also, the same PM2 process manager will ensure that processes restart, instantly, if they happen to crash

You'll just need to write your Node.js app as if it were for single-core usage, and the PM2 module will make sure that it gets scaled for multi-core. The sketch below shows the same settings expressed as a PM2 ecosystem file.
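Beyond the one-off command, PM2 can read its settings from an ecosystem file. A minimal sketch (the file name ecosystem.config.js is PM2's convention; the app name and script path are placeholders):

module.exports = {
  apps: [
    {
      name: 'api',
      script: './app.js',
      instances: 'max',     // one process per available CPU core
      exec_mode: 'cluster', // let PM2's cluster mode balance the requests
    },
  ],
};

You would then start everything with: pm2 start ecosystem.config.js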
Note: now, if you want to scale your Node.js application further, you might want to consider deploying more machines...

Scaling Across Multiple Machines with Network Load Balancing
The underlying process is more than similar to the “multiple-core scaling” one, if you come to think of it: instead of several cores, you'll have several machines; each one will be running one or more processes and will get “backed up” by a load balancer redirecting traffic to each machine in this infrastructure. “And how does a network balancer work, more precisely?”, you might ask yourself: once a request is sent to a node, the balancer sends the traffic to a specific process. And there are 2 ways of deploying your internal balancer:

- deploy a machine and set up a network balancer yourself, using NGINX
- use a managed load balancer (like Elastic Load Balancer); setting it up is conveniently easy and it “spoils” you with all kinds of built-in features, such as auto-scaling

Now, if your “how to scale your Node.js app” question turns into “Isn't it risky to have just one point of failure for my infrastructure?”: just deploy multiple load balancers instead of relying on a single one. They would all be pointing to the same servers, needless to add. Note: for distributing traffic across your “ecosystem” of internal balancers, you could just add several DNS “A” records to your main domain.

How to Scale Your Node.js App: 3 Scaling Strategies to Consider

1. Decomposing
“Microservices” is another word for this scaling strategy. For, practically, you'll be “juggling” multiple microservices (although their size is of no significant importance, actually). Or multiple applications, with different codebases (and, in many cases, each one with its own UI and dedicated database). It's by services and functionalities that you'll be decomposing/scaling your Node.js app. A strategy that can lead to unexpected issues in the long run but which, if implemented correctly, translates into clear gains for your apps' performance.

2. Splitting
Or “horizontal partitioning”, or “sharding”, if you prefer. This strategy involves splitting your app into multiple instances, each one responsible for a single, specific part of your app's data! Word of caution: data partitioning calls for a lookup before you carry out each operation; this way you'll identify the right instance of the application to be used. Take this example: you might want to partition your Node.js app's users by language or area of interest. In this case, a lookup step is a must; you'll need to check that information first.

3. Cloning
And this is the easiest strategy at hand for solving your “how to scale your Node.js app” dilemma! Just clone your Node.js back-end application multiple times and assign a specific part of the workload to each cloned instance! It's both effective and cost-effective! Moreover, Node's cluster module makes cloning on a single server ideally easy to implement!

And this is “how to scale your Node.js app”! See? You have not just one, but several Node.js built-in tools at hand, and various strategies to choose from, depending on your scaling needs. Which scaling solution suits you/your app project best?
... Read more
RADU SIMILEANU / May 03'2018
Node.js 10 Is Out: Here Are 10 New Features and Improvements Worth Getting Really Excited About
Have no fear... Node.js 10 is here (since April 24, actually)! And, as expected, this version is planned to grow into the platform's official Long Term Support version (in October 2018), to be supported for 3 years after that date. So? What's in it for you, the back-end web developer? Are there any new features and improvements worth getting really excited about? Which are they, and how precisely will they improve the overall developer experience? Now, before we take a deep dive into the “steamy fresh load” of new features, I feel like pointing out that:

- it's mostly incremental improvements, applied throughout the entire codebase of the platform, that Node.js 10 ships with
- … performance, reliability and stability-centered improvements, bubbling up to the back-end developer's experience

But let's name these improvements that ship with the new version of Node.js. Let's talk specific incremental changes, shall we? Here are 10 of the “really worth getting excited about” ones:

1. Error-Handling Improvements
Error messages/error-handling improvements make up the majority of the semver-major commits (approx. 300) that Node.js 10 ships with. It's a “pledge” made since Node.js 8.0.0 to assign static error codes to all Error objects: “error messages should be useful, more consistent and predictable” has been the “pledge” driving all the sustained efforts geared at improving error handling. Note: error codes have been included in Node.js 10, making constant error-checking conveniently easier!

2. Enhanced JavaScript Language Capabilities
There's an entire list of Node.js 10 language improvements (you can find them all here) worth exploring and... exploiting. I'll outline the highlights only:

- you now get to use the line and paragraph separator symbols (U+2028 and U+2029) in string literals, matching JSON
- V8 introduces String.prototype.trimStart() and String.prototype.trimEnd()
- Function.prototype.toString() returns the exact “pieces” of the source code text (comments and whitespace here included!)
- the catch clause of try statements no longer requires a parameter
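To see a few of these language capabilities in action, here's a quick sketch you can run on Node.js 10 or later:

// Optional catch binding: no error parameter needed when you don't use it.
function parseOrNull(json) {
  try {
    return JSON.parse(json);
  } catch {            // <- no `(err)` required anymore
    return null;
  }
}
console.log(parseOrNull('{"ok": true}')); // { ok: true }
console.log(parseOrNull('not json'));     // null

// String.prototype.trimStart() / trimEnd():
console.log('  padded  '.trimStart()); // 'padded  '
console.log('  padded  '.trimEnd());   // '  padded'

// Function.prototype.toString() now reproduces the source exactly,
// comments and whitespace included:
function demo(/* a comment */ x) { return x; }
console.log(demo.toString()); // prints the comment and spacing verbatim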
3. The Node.js fs (File System) Module Has Been Significantly Overhauled
Here are the most “dramatic” improvements made during this overhaul:

- type checking and error handling have been improved
- the code got restructured, for easier maintainability
- a new experimental fs/promises API got implemented, featuring a first-class Promise-based API

Speaking of this new API, it generates a warning at runtime the very first time it gets used. Hopefully, things will turn out “bug-free”, so that it can grow from experimental to stable.

4. Node.js 10 Ships with Full Support for N-API
N-API — the ABI-stable (Node.js) API for native modules — has leveled up to a stable version in Node.js 10. What does this mean?

- it provides a stable module API, one that is not influenced by changes in Node.js's V8 JavaScript engine
- the API layer makes upgrading a whole lot easier, streamlining production deployments and... easing module maintainers' lives
- … and it goes without saying that this bubbles up to native modules' maintenance costs, as well

In short: say goodbye to module breakage!

5. The Assert Module: Explore Some Crucial Improvements
All efforts targeting the assert module have been aimed at easing the internal implementation and improving the developer experience. But let me point out some of these improvements that eventually fleshed out and landed in Node.js 10:

- a new “diff” view got implemented for whenever assertion errors get generated
- overall, the output becomes more descriptive, more... “verbose” (and implicitly more useful)
- better object comparisons
- promises support
- detailed error messages

6. Node.js 10 Ships With V8 6.6: Expect a Major Performance Boost
Get ready to “exploit” V8 6.6's range of performance improvements to their full potential! Along with its new set of JavaScript language features! From them all, I can't but mention the performance work on:

- async functions
- async generators
- promise execution

7. Cryptographic Support
Node.js 10 is the first version of the platform to include OpenSSL 1.1.0! And this can only translate into enhanced protection for your priceless data! Now, if I am to outline just 2 of the OpenSSL features to look forward to tapping into, I should definitely mention:

- the Poly1305 authenticator
- the ChaCha20 cipher

8. The Trace Events Mechanism: Monitoring Your Code's Performance Just Got Easier
That's right! Keeping a close eye on how your code's performing, and being able to quickly diagnose any emerging issues, is easier than ever with Node.js 10! Basically, what these trace events do is enable all the diagnostic information output to be collected into a file accessible to the Chrome browser's DevTools utility. There's no need to use a command-line flag anymore to trigger this whole trace events mechanism underlying Node.js. And since we're here, let me point out 2 trace events-related improvements worth getting (really) excited about:

- the node.perf.usertiming category got added — its role is that of capturing, in the trace event timelines, all the Performance API user timing marks and measures
- a JavaScript API got implemented as well; enabling/disabling trace events dynamically is now possible in Node.js:

const trace_events = require('trace_events');

const tracing = trace_events.createTracing({
  categories: ['node.async_hooks', 'v8'],
});

tracing.enable();
// do stuff
tracing.disable();

9. HTTP and HTTP/2 Improvements
Another thing to get excited about, when it comes to Node.js 10's release, is all the incremental improvements made to HTTP and HTTP/2. Let me detail a bit:

- when it comes to HTTP, the changes applied range from improved Streams API compatibility to stricter standards support, to improved header and error handling
- when it comes to HTTP/2, significant progress has been made toward getting it as close to “stable” as possible before Node.js 10 reaches its Long Term Support cycle. I'm talking here about improvements made to the way trailing headers in requests and responses get implemented, and about overall improvements to the internal implementation and the public API

10. Node.js Ships With the Experimental Node-ChakraCore
And how does this impact the developer experience? Your experience?

- using the JavaScript engine to its full potential
- tapping into Time Travel debugging

… gets a whole lot easier for you. You're practically enabled to detect errors way before they even get to “infest” your code.

The END! This is how our list of 10 Node.js 10 features worth getting (really) excited about looks! Do explore them and start your preparations for moving over to this new version of Node.js before October!
... Read more
RADU SIMILEANU / May 02'2018
How to Use Bootstrap with Angular 4? Here Are 3 Ways to Add It To Your Project 
Here you are now: your Angular 4 front-end app, ready to... wow its users! “Almost ready”, actually! For it still needs styling... And what better HTML and CSS framework to go for than Bootstrap, right? But how to use Bootstrap with Angular 4, more precisely? How do you properly integrate it into your Angular 4 CLI project? Great news: you have not just one, but 3 options at hand for adding it! Let me get into details:

On Using Bootstrap in Your Front-End Development Process
Is there any need to list here the reasons why it's precisely Bootstrap that you're planning to implement into your Angular CLI project? Angular 4, to be more specific. After all, it's the most popular framework for styling websites built in HTML, CSS and modern web & mobile JavaScript frameworks (like Angular here): it's an open-source, feature-rich framework that turns front-end development into such a “breeze”. Basically, it empowers you to build responsive layouts without the need to be a CSS “expert”. And now, let's break down the step-by-step “tutorial” on how to use Bootstrap with Angular 4:

Step 1: Create a New Angular Project Using Angular CLI
The very first step to take is, obviously, setting up a brand new project. Use the Angular command-line interface to generate it. But first, install it on your system:

$ npm install -g @angular/cli

It's only then, once you've installed its npm package, that you can go ahead and... generate your new project. For doing this, just type the following command in your CLI:

$ ng new myproject

Next, feel free to change into that specific directory and to turn on the web server:

$ cd myproject
$ ng serve

“App works!” This is the message that you should be seeing in your browser right now.

Step 2: Install Bootstrap in Your Project
Now that you've launched your new Angular project, it's time to add your Bootstrap library, as well. And you sure aren't nickel-and-dimed in options: there are 3 ways to add Bootstrap to Angular 4.

Step 3: How to Use Bootstrap with Angular 4 — 3 Different Ways to Integrate It

Option 1: Install Bootstrap from CDN
There are 2 particular files that you'll need to install from CDN into your project:

- the Bootstrap CSS file
- the Bootstrap JavaScript file

Note: keep in mind to add the jQuery JavaScript library file, as well! Next, open the src/index.html file and insert the following:

- the <link> element to add the Bootstrap CSS file at the end of the head section
- a <script> element for adding jQuery at the bottom of the body section
- a <script> element for inserting the Bootstrap JS file at the bottom of the body section
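Concretely, those additions to src/index.html could look like this. A sketch assuming Bootstrap 3.3.7 and jQuery 3.2.1 served from public CDNs; swap in whatever versions and URLs suit your project:

<!-- in <head>: Bootstrap CSS -->
<link rel="stylesheet"
      href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css">

<!-- at the bottom of <body>: jQuery first, then Bootstrap's JS -->
<script src="https://code.jquery.com/jquery-3.2.1.min.js"></script>
<script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/js/bootstrap.min.js"></script>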
Eager to see “Bootstrap in action” in one of your project's component templates? Then give it a try:

- open src/app/app.component.html
- enter the following code there:

<div class="container">
  <div class="jumbotron">
    <h1>Welcome</h1>
    <h2>Angular & Bootstrap Demo</h2>
  </div>
  <div class="panel panel-primary">
    <div class="panel-heading">Status</div>
    <div class="panel-body">
      <h3>{{title}}</h3>
    </div>
  </div>
</div>

And it's the following message that this HTML template code should render in your browser: “app works!”

Note: go for a Bootstrap theme of your choice; once you've downloaded it (from Bootswatch.com, for instance), its bootstrap.min.css file will get instantly opened in your browser. Just copy the file's URL and use it to replace the string assigned to the href attribute of the <link> element in the index.html file. And voila! It's precisely those colors, defined by your chosen theme, that get displayed in the browser now!

Option 2: Install Bootstrap Using npm
And here's another valid answer to your “How to use Bootstrap with Angular 4” dilemma! Simply enter:

$ npm install bootstrap@3 jquery --save

It's this command that will integrate Bootstrap and jQuery into the node_modules folder of your Angular 4 project directory. Moreover, it will include these 2 dependencies in the package.json file, as well. Once properly installed, you can find the packages' files at:

- node_modules/bootstrap/dist/css/bootstrap.min.css
- node_modules/bootstrap/dist/js/bootstrap.min.js
- node_modules/jquery/dist/jquery.min.js

Note! You have 2 options for integrating those files into your Angular 4 project:

- add the file paths to the styles and scripts arrays of the .angular-cli.json file
- add the corresponding <link> and <script> elements to your index.html file

Option 3: Add NG-Bootstrap to Your Project
The great thing about this method is that you no longer need the jQuery and Bootstrap JavaScript dependencies. Ng-Bootstrap comes packed with a set of built-in native Angular directives, already based on Bootstrap's markup and CSS. Now, getting back to our initial “How to use Bootstrap with Angular 4” question, let's see how we install this npm package. For this, just enter the following command in your Angular 4 project directory:

$ npm install --save @ng-bootstrap/ng-bootstrap

Next, make sure you also install Bootstrap 4 in your project:

$ npm install bootstrap@4.0.0-alpha.6

And the final step is to add bootstrap.min.css to your .angular-cli.json file (jQuery and bootstrap.min.js aren't needed with Ng-Bootstrap's native directives). Now you still need to import Ng-Bootstrap's core module — NgbModule — from its @ng-bootstrap/ng-bootstrap package. To do this, just add the following import statement to app.module.ts:

import { NgbModule } from '@ng-bootstrap/ng-bootstrap';

All that's left for you to do now is to add the NgbModule to the @NgModule decorator's imports array. And since we're here, you'll find some more than “enlightening” info (chunks of code included!) on the 2 different options at hand for importing the NgbModule:

- either in your project's child modules
- or in the root module itself

… in this article here on Using Bootstrap with Angular.

Using the NG-Bootstrap Components: Which Are They?
With the NgbModule installed in your Angular 4 project, you're now able to use the Ng-Bootstrap components and leverage them in your app.component.html. Speaking of which, here are the components at hand: Accordion, Alert, Rating, Tabs, Carousel, Progressbar, Collapse, Datepicker, Buttons, Pagination, Typeahead, Popover, Timepicker, Dropdown, Modal, Tooltip.

The END! Does this answer your “How to use Bootstrap with Angular 4” question? Which method of adding this front-end framework to your project is more suitable for you?
... Read more
Silviu Serdaru / Apr 30'2018
Which Are the Free Magento 2 Blog Extensions You Should Be Using? 9 Must-Haves
Informative, entertaining, engaging and... a key revenue source! These are just some of your expectations regarding your Magento 2 blog, right? Well then, get ready to check them all off your “wishlist” while digging through my shortlist of can't-believe-it's-free Magento 2 blog extensions. From SEO-oriented to shipping-focused features, from functionalities centered on social media to those geared at enhancing page loading speed, these 9 extensions are, each, extra functionalities to inject into your blog. So that it (your blog) serves your specific needs and helps you reach your goals. And that without having to “stretch” your budget (there are only 100% free extensions on this list)... Oh, yes: and they all wear the signatures of certified Magento partners! And now, let's get straight to these must-have Magento 2 extensions that you should be turbocharging your blog with:

- all of them “spoiling” you with configurations that make customization unexpectedly easy
- … blending perfectly into your blog's design and fitting into your codebase (no need to depend on an “army” of coding experts)

1. Magento 2 Image Slider
Let's review a visual/aesthetics-oriented extension first things first. For, as mentioned above, a “money-making” blog shouldn't be purely informative and helpful, but... engaging and visually arresting, as well. So imagery does play a major part here! Now, here are a few of this extension's key features:

- supports no less than 10 sliders
- built-in support for inserting video, text and images
- one of those fully responsive free Magento 2 blog extensions
- provides tons of animations, with Live Preview, for you to select from
- supports OWL Carousel
- conveniently intuitive UI
- you're free to display it anywhere on your blog, with CMS & Widget support

2. Facebook Live Chat
A blog is the ultimate channel of communication with your brand's audience. With your e-store's regular and potential customers. Well then, moving from standard communication to... instant communication is a must if you want to meet their expectations. And this is what makes Facebook Live Chat one of the must-have free Magento 2 blog extensions. It's that chatbox incorporated into your blog that's powerful enough to turn “just” guests into loyal customers. And now, let me point out some of its most powerful features:

- there's a Like button and a store profile incorporated into the chatbox
- user statistics capabilities
- unlimited chat history
- you get to set upcoming events, define greeting text and integrate your e-store's Facebook profile into the chatbox
- simple backend operations for enabling/disabling the chatbox displayed on your blog
- familiar UI: a Facebook Messenger-like chatbox interface

3. Magento 2 Lazy Load
A must-have extension for your Magento 2 blog if you care about the user experience that you provide there. And page loading speed does play a key role in improving it (or impacting it negatively). Moreover, besides optimizing your blog's performance, Magento 2 Lazy Load creates some aesthetically pleasing image transitions influencing the UX.
But let's get deeper into details and “unearth” all those advanced features that make this extension one of the must-haves:

- it helps you save your web server's resources — saves bandwidth and minimizes server requests
- it creates smooth, blurring-effect transitions for your lazy-loaded images
- … and a smooth, visually pleasing transition when users keep scrolling down your pages
- it gives your blog a ranking boost by creating friendly code strings
- it optimizes your blog's page loading time
- you're free to enable/disable the “Lazy Load” mode for each one of your blog's pages
- you get to set an advanced time point for loading pages

4. Better SEO, One of the Free Magento 2 Blog Extensions You Should Be Using
Inject Better SEO into your blog and... propel it up the search engine results! And it's not “just” packed with clever features, but ideally easy to use, as well. Built to fit into your blog's existing code structure and to empower you to customize it to serve your SEO goals in detail. I'm talking here about:

- meta descriptions
- meta keywords

… which this extension is flexible enough to let you insert quickly and easily. Now that we've settled that Better SEO makes an ideally customizable, blog/store-friendly extension, let's check out its powerful features:

- SEO checklist — a more than handy “to do” list, pointing out the SEO tasks to complete for reaching a high SEO score
- it detects duplicate content issues
- advanced HTML/XML sitemaps — one for the users, the other one to be used by search engines
- structured data — implements schema structured data
- metadata template rules — easy-to-define mass and dynamic metadata for your pages, categories and layered navigation
- provides you with actionable SEO reports
- rich snippets preview
- cross-links
- social optimization

5. Exto Analytics
Applying a marketing strategy that lacks the proper data fuel is like aiming at a target... blindfolded. So, if relying on pure chance doesn't define you, and if you want to go beyond the data provided by the native Magento 2 reporting functions, go with Exto Analytics. Here are some more heavy-weighing reasons to do so:

- real-time mobile dashboard, so you remain “connected to” your data anytime, anywhere
- convenience at its best when it comes to handling your reports — you get to sort data by specific columns and even to turn off the columns feature itself
- date range picker — compare and evaluate your blog's performance over different periods of time
- your previous data gets added to your reports, as well, once you install the extension
- a chart enabling you to visualize all data reports in parallel

6. Magento 2 Admin Theme
From user experience to... admin experience. As your own blog's admin, you should also consider making your dashboard more user-friendly and intuitive. For a high level of convenience on your side will bubble up, eventually, into the experiences that you create for your visitors. But let's see specifically what makes Admin Theme one of the best Magento 2 blog extensions to use:

- mobile-optimized
- easy to use and quick to customize
- retina-ready
- clean, neatly structured code
- a different interface for Login & Forgot Password
- admin icon font
- translation-ready

7. Magento 2 Infinite Scroll
It does precisely what its name says: it keeps loading content, without interruption, as your blog guests scroll down. Fluidity in the way you present content to your readers translates into improved user experience!
And now, let's scan through this extension's specific features:

- you can display and easily change the “Show” button, along with its loading text
- the navigation bar can be placed anywhere on the page
- you can implement it both on your category pages and on the search page
- the pages that your readers land on get automatically loaded while scrolling down
- your blog guests know, at all times, what section of the blog they're on
- you get to customize your progress bar to your liking
- users get to share the links of the specific pages that they reach during their scrolling (for instance, if they're on page 8 of your blog, they can bookmark/share the link of precisely that page)

8. Better Blog
Now, let's imagine that you don't own a blog yet, “only” an e-store. And that now you want to integrate a simple blog, as well. One that should:

- be conveniently easy to configure
- have a beautiful layout design to “wow” your readers with
- load fast
- come packed with much-needed backend features, making updating content unexpectedly easy for you, the admin

Checked, checked, checked! Better Blog is undoubtedly one of the must-go-to Magento 2 extensions, no matter the size of your current e-commerce site. Once integrated into your Magento store's backend, you get to manage both your store and your blog from the very same place. Here are the main reasons why it is one of the best Magento 2 blog extensions:

- SEO-friendly: SEO-friendly URLs, metadata information, XML sitemap
- open-source code
- layered navigation, with a significant impact on UX (your blog guests get to quickly track down precisely those posts that they're looking for)
- out-of-the-box comment functionality: Disqus Comment, Facebook Comment
- blog topics
- built-in product recommendation features: “Who Bought This Also Bought”, “Auto Related Products”, “Frequently Bought Together”
- the option to integrate your store's or your blog's sitemap
- responsive design
- social sharing buttons
- blog widgets: show your (recent) posts on your site's homepage (sidebar included)

In short: you get to integrate a simple blog with your e-store, with no need for a third-party framework! Moreover, you'll be managing comments, categories and posts right from your Magento 2 admin, quickly and easily. And you'll get informed each time a blog guest posts a comment, not to mention that the extension grows into a powerful “ally” supporting your SEO efforts. One of the must-have Magento 2 extensions, without question!

9. Magento 2 SMTP
A powerful extension to “turbocharge” your Magento 2 blog with, so you:

- gain total control over your email customization process
- get to run test sessions on your Magento 2 SMTP server

And it does all that by providing your blog with a configurable port and host. Now, let's go through its cool features:

- it stores all sent email logs
- built to support 20+ SMTP service providers
- enables you to test how well your current email setting is doing
- empowers you to customize your emails in the slightest detail

The END! These are the 9 best Magento 2 blog extensions that you should be using. Scan them through, “weigh” their feature loads against your own needs and growth plans for your blog and... go for the most suitable ones!
... Read more
Adriana Cacoveanu / Apr 27'2018
How React Virtual DOM Works: Why Is It (So Much) Faster than the “Real” DOM?
What's the deal with the virtual DOM? How does the React virtual DOM work, precisely? It's significantly faster, without question, and it brings a whole series of benefits to coding. How come? Which efficiency issues of the “real” DOM does it solve? And what makes the way React.js manipulates the DOM better than the “standard” way? Let's get you some answers:

But First: What Is the DOM, Anyway?
“Document Object Model.” It's only natural that, before we get into details on React and the virtual DOM, we gain a deep understanding of the DOM itself. Therefore, here's a definition that hopefully sheds enough light on this concept: the DOM is a tree-structured abstraction of (or an in-memory representation of, if you prefer) a page's HTML code. One that preserves the parent/child relationships between the nodes within its tree-like structure. Any better? The major benefit is the API that it provides, which allows us, developers, to easily scan through the HTML elements of a page and to manipulate them as needed. For instance:

- to add new nodes
- to edit a given node's content
- to remove specific nodes

And What Is DOM Manipulation, More Precisely?
It's the very process that enables the content on any of your website's pages to be dynamically updated. Needless to add, it's JavaScript that you would use when handling the DOM. Also, methods such as:

- removeChild
- getElementById

… are included in the API that the “actual” DOM provides you with.

What Efficiency Challenges Does the “Real” DOM Face?
Now, before we go back to your initial “dilemma” (“how does the React virtual DOM work?”), let's see why a “virtual” DOM was even needed in the first place. What efficiency issues of the “real” DOM does it address? So, it's JavaScript that we use as we manipulate the DOM, right? And it used to work fantastically back in the days when static UIs would “rule” and the concept of dynamically updating nodes wasn't yet... “invented”. Well, since then things have changed... DOM manipulation, once the core process of all modern interactive web pages, started to show its limitations. And that's because the “real” DOM would update a “target” node along with the entire web page (with its corresponding layout and CSS). For instance, imagine that you have a list of items and it's just one of those items that you need to update. Traditionally, the “real” DOM would re-render the entire list and not exclusively the items that receive updates. See? Just think of a scenario where you have an SPA (single-page app): one with thousands of dynamically generated nodes, which would all need to “listen to” lots of future updates and re-render them in the UI. It's here that things get discouragingly... slow! The real DOM can't cope with pages carrying thousands and thousands of components to be re-rendered when updates are being passed through. It's in this context that the virtual DOM stepped in! And it's React that makes the most of it. Clear enough?

How React Virtual DOM Works: Snapshots, Diffing and Reconciliation
Before we get into the “how”, let's shed some light on the “what”. What is the “virtual” DOM? A lightweight abstraction/copy of the HTML DOM, having the same properties as the “real” one. The only difference is that it can't write to the screen like the actual DOM can. Also, it's local to React. A copy of the actual DOM that you get to update “intensively” without impacting the real DOM.
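To make the idea concrete before digging into the mechanics, here's a toy sketch of what “diffing” two virtual trees could look like. This illustrates the concept only; it is emphatically not React's actual reconciliation algorithm, and the { type, props, children } node shape is an assumption for the demo:

// A virtual node here is just a plain object: { type, props, children }.
function diff(oldNode, newNode, patches = []) {
  if (!oldNode) {
    patches.push({ op: 'create', node: newNode });
  } else if (!newNode) {
    patches.push({ op: 'remove', node: oldNode });
  } else if (oldNode.type !== newNode.type) {
    // Different element types: replace the whole subtree.
    patches.push({ op: 'replace', from: oldNode, to: newNode });
  } else {
    // Same type: keep this node, diff the children recursively.
    const oldKids = oldNode.children || [];
    const newKids = newNode.children || [];
    const len = Math.max(oldKids.length, newKids.length);
    for (let i = 0; i < len; i++) {
      diff(oldKids[i], newKids[i], patches);
    }
  }
  return patches; // only these patches would ever touch the real DOM
}

const before = { type: 'ul', children: [{ type: 'li', children: [] }] };
const after = { type: 'ul', children: [{ type: 'li', children: [] }, { type: 'li', children: [] }] };
console.log(diff(before, after)); // [{ op: 'create', ... }] — one targeted patch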
Note: do keep in mind that it isn't React that introduced this concept, since there are plenty of other libraries using it.

Snapshots, Diffing and Reconciliation
Now, let's get into the details of how the React virtual DOM works.

a. First of all, React takes a virtual DOM snapshot before doing any updates.
b. It will then use it (this record of the DOM state) to compare against the updated virtual DOM, before applying any changes to the actual DOM itself.

And it's a “diffing algorithm” that supports all this comparing and enables React to identify any changes, to detect the updates that have been applied. The entire process is called “reconciliation”: whenever updates need to be made to the actual DOM, React updates the virtual DOM first, and then, once it has done its comparing, it syncs the real DOM. In other words: before applying any of the requested updates, React makes a copy of the virtual DOM, which it will then set against the updated virtual DOM (diffing). It's during this diffing-reconciliation process that React detects the changes that have been applied and identifies the objects to be updated. And it's precisely those objects that it will update in the actual DOM. The huge benefits?

- the virtual DOM updates a whole lot faster
- it updates exclusively the “target” nodes, leaving the rest of the page alone

Summing Up
To recap, let's sum this whole “how React virtual DOM works” guide up to its bare essentials. Here's how React updates the DOM, in 3 simple steps:

- first, it applies the given updates to the whole virtual DOM
- then, it compares it with the snapshot of the virtual DOM that it will have taken, using an algorithm called “diffing” to spot any changes/contrasts
- then, it's specifically (and exclusively) those changed elements that it updates in the actual DOM

The END! Have I managed to make this process any clearer for you? Can you now see what's “under the hood” of the way React updates the DOM? And the specific reasons why it's so much faster than real DOM manipulation? ... Read more
RADU SIMILEANU / Apr 26'2018