What can you do to speed up your Magento 2 store on mobile devices?
Let's face it: Magento 2's “ecosystem” of third-party extensions and its overall modular architecture are convenience at its best, for any developer and any eStore owner alike. They empower both to start small and then scale up to their most daring goals. Yet, all this power placed in your hands does come at a cost: reduced speed.
And top speed is crucial if you're determined to deliver a great mobile user experience.
So, what are the tweaks that you can make to boost your eStore's performance?
Luckily, there are plenty, ranging from:
well-known (and too frequently underrated) best practices, like optimizing your product images
to slightly more in-depth “tweaks”, like inlining critical CSS
But, let's dive right in! Here's your “emergency kit” of 5 solutions to apply to your Magento 2 store for improving its performance on mobile:
1. Reduce Page Size to Increase Page Loading Speed
It's still those “too obvious to be truly effective” techniques that have the biggest impact on an eStore's performance on mobile devices:
Lower your web page size and it will make a drastic difference to your mobile users' experience with your online store; especially for those accessing your site from devices on low-bandwidth connections.
Now here are a few simple, yet effective tweaks that you can make to reduce page size:
1.1. Use GZIP to Compress Your Pages
A handy “trick” that you can perform is to enable GZIP (if it's not already enabled) and let it “work its magic” on your web page's size.
It will compress:
fonts
CSS files
external scripts
JS
… cutting your pages' “weight” down by almost 70%.
Note: put any of your front-end pages to the Google PageSpeed Insights “test”; take note of the GZIP-related warnings popping up and ensure that the CSS/JS compression feature is enabled.
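A quick way to verify compression from the command line (with example.com standing in for your own storefront URL): if GZIP is on, the response headers will include a Content-Encoding: gzip line:
curl -s -o /dev/null -D - -H "Accept-Encoding: gzip" https://example.com/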
1.2. Enable JavaScript/CSS Files Minification
Here's another built-in Magento 2 feature that you only need to... trigger in order to speed up your Magento 2 store on mobile devices: CSS/JS file minification.
Note: do keep in mind, though, that it works in production mode only (so neither in default nor in developer mode)!
Here's how you enable it:
Navigate to the backend menu
Stores > Configuration > Advanced > Developer
Set your app/site's production mode:
php bin/magento deploy:mode:set production
Note: not sure what mode your eCommerce site's running on now? Enter the following command to identify its current mode:
php bin/magento deploy:mode:show
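If you prefer the command line over the backend menu, the same minification switches can be flipped with config:set (available as of Magento 2.2):
php bin/magento config:set dev/js/minify_files 1
php bin/magento config:set dev/css/minify_files 1
php bin/magento cache:flush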
1.3. Optimize Your Product Pages
And the more crowded your product catalog is, the more important this solution becomes!
“Are you implying that I should take each and every one of my product images and optimize them... one by one?” you might ask yourself.
Definitely not! Since there are at least 2 easy solutions that you could go for:
you can use a content delivery network (CDN) as it will take the image optimization “burden” off your back
you can leverage the Google PageSpeed (GPS) server extension; it will compress your images in no time, among other “tricks” that it performs to speed up your Magento 2 store on mobile
2. Reduce The Server Response Time to Speed up Your Magento 2 Store
Optimizing your server's response time (or “time to first byte”) is another critical tweak that you can do to boost your Magento 2 store's speed.
Set your “target” to 0.5s, the time a browser would need to wait for your website's server response.
“But why bother, since Magento provides me with full-page cache functionality right out of the box”, you might wonder.
That's true, but just consider the particular pages (checkout, cart, customer account pages) that this pre-built functionality can't “work its magic” on.
2.1. Run a Thorough Audit on Your Third-Party Extension "Load"
Start reducing your server response time with a basic, yet so very effective step:
Audit your entire modules infrastructure!
turn them on and off, one by one, just to detect any negative impact on your Magento 2 site's performance
identify any issues with your current plugins and (if any) look for a patch or replace them with more performant ones (the commands below come in handy here)
Note: as a rule of thumb, try keeping your Magento 2 third-party extensions to a minimum! Trim down your collection of modules, keeping only the must-have ones; otherwise, their combined weight will affect your eCommerce site's performance!
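A couple of built-in CLI commands make this audit less tedious (Vendor_Module below is a placeholder for whichever extension you're testing):
php bin/magento module:status
php bin/magento module:disable Vendor_Module
php bin/magento module:enable Vendor_Module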
2.2. Use Magento 2 Profiler to Detect Any Server Performance Issues
“What's a profiler?” you ask.
A program geared at identifying just how much time a code block needs to execute.
Using a profiler, you'll actually be drilling deep(er) into your Magento 2 store's internals to detect the very root cause of your bad server response time!
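Switching Magento 2's built-in profiler on is as simple as setting an environment variable for your web server; with the html option, for instance, the timing report gets appended to the bottom of every page (the line below assumes Apache and .htaccess):
SetEnv MAGE_PROFILER html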
2.3. Consider Upgrading Your Hosting Plan
Is it time you upgraded your hosting server? More RAM and CPU will always have a huge impact on your eCommerce website's performance, you know.
So, how do you know whether it's time to upgrade?
Just install a brand new Magento 2 website on your current server. If it's noticeably speedier than your live website, your server isn't the bottleneck, so there's no need to change your current hosting plan; in this case, you'll only need to focus on the other tweaks included in this list here to speed up your Magento 2 store on mobile. If the fresh install is just as slow, though, it's the server itself that's holding you back, and an upgrade is in order.
2.4. Use Varnish as a Full-Page Cache (FPC) Solution
Another trick for improving Magento 2's performance is to leverage Varnish, the software that caches and serves content.
The good news is that Magento 2 supports it natively.
And here's how you trigger its “power”:
navigate to the backend menu
Stores > Configuration > Advanced > System > Full Page Cache
Note: you'll need to enter a hostname, port and Varnish export configuration; if in doubt, ask your system admin for a hand to set everything up properly.
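The same switch can also be flipped from the command line, and Magento can even generate the VCL file for you (a sketch, assuming Magento 2.2+'s config:set; 2 is the backend value for Varnish, and the output path is just an example):
php bin/magento config:set system/full_page_cache/caching_application 2
php bin/magento varnish:vcl:generate > /etc/varnish/default.vcl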
3. Load and Render Above-the-Fold Content First
Prioritize the content that appears before scrolling down! It will make all the difference when it comes to your Magento 2 eStore's page loading time!
And now, here are the techniques at hand for loading and displaying this content first:
3.1. Defer Loading JavaScript
Moving all your JS code to the bottom of the page (“beneath the fold”) will implicitly make your AF (above-the-fold) content load quicker.
You'll basically postpone the time-consuming parsing of JS code and thus speed up your Magento 2 store on all mobile devices!
The good news is that there already are Magento 2 extensions that do the job for you. They'll move all your non-critical JS scripts beneath the fold!
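In plain HTML terms, deferring comes down to something like this (app.js standing in for one of your non-critical scripts):
<!-- Downloaded in parallel, but executed only after the document has been parsed: -->
<script defer src="/js/app.js"></script>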
3.2. Inline Critical Above-the-Fold CSS
“But what about the above-the-fold CSS?” you might legitimately ask yourself.
How do you approach these critical files? For you definitely can't place ALL your CSS at the bottom of the page, now can you?
Well, first you:
extract/isolate precisely this “critical” CSS
then you inline it straight to the HTML; right between <head> and </head> tags
This way, it will get loaded and rendered first (before the non-critical CSS), along with the rest of the above-the-fold content.
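Schematically, the result looks like this; the preload/onload trick on the <link> element is one common (loadCSS-style) way to pull in the remaining, non-critical CSS without blocking the first render:
<head>
  <style>
    /* Only the critical, above-the-fold rules go here */
    .header { background: #fff; }
  </style>
  <link rel="preload" href="/css/styles.css" as="style" onload="this.rel='stylesheet'">
</head>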
Note: you might be tempted to go for one of those tools “promising” to extract this CSS for you. Unfortunately, manually setting the critical CSS for each one of your page types (homepage, checkout, category etc.) is the right way to do it.
4. Leverage the Power of HTTP/2
By moving your Magento 2 website over to HTTP/2 you'll grant your eStore users a secure and faster-browsing experience.
Not to mention the impact that it will have particularly on the experiences of those customers using a slow mobile network to access your online store.
The tremendous news is that Magento 2 works with HTTP/2 by default. Yet, there are 2 conditions that you need to make sure your online store meets:
your hosting server should already support HTTP/2
your eCommerce web pages should be served via SSL
Note: run your own "investigation" and look for suitable extensions providing HTTP/2 server push support.
5. Magento 2 Performance Optimization: Disable JS Bundling
But why would you want to disable a Magento 2 feature that actually lowers the HTTP requests made by your browser for loading and rendering a web page?
Because it comes with its own side effects, the main one being the oversized JS file that this feature generates, of about 5–10 MB.
Moreover, downloading this huge external file often takes more time than you'd actually save by reducing the number of HTTP requests.
Now that we've tackled the “Why”, let's focus on the “How”, as well. Here's how you disable JS bundling:
go to your website's backend menu
Stores > Configuration > Advanced > Developer
and apply the following configuration:
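The key switch is “Enable JavaScript Bundling”, under JavaScript Settings, which you set to No. From the command line, the equivalent would be (assuming Magento 2.2+'s config:set):
php bin/magento config:set dev/js/enable_js_bundling 0
php bin/magento cache:flush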
Note: there's no need to disable this JS files grouping feature if you're already using HTTP/2!
The END! These are but 5 of the handiest solutions that you could use to speed up your Magento 2 store on mobile. As you can see, the list includes nothing more than predictable “tweaks” and well-known best practices that you should stick to.
Silviu Serdaru / May 23'2018
Just imagine: a user asks Amazon Alexa to read out loud the headline of your latest blog post! Or to look for a specific section on your Drupal site! Or, even better: quit imagining this and start implementing it instead! Right on your website. And here's how you integrate Alexa with your Drupal 8 website via the Alexa integration APIs.
A 7-step tutorial:
on how to get Alexa to “talk to” your site users/online customers
on turning your site's content into the needed “raw material” for setting up your custom Alexa skills
on how you can leverage Drupal 8's outstanding third-party integration capabilities to “fuel” your implementation plan
So, here's how it's done:
But Why Precisely Amazon Alexa over Other Voice Assistants?
Because Amazon Alexa stands out with its top notch integration capabilities.
Its integration APIs make it possible for this particular voice service to be “plugged into” various devices and web services.
As simple as that! Alexa's more than just a voice assistant making voice (obviously!) interaction possible:
It's a voice service that empowers you to integrate it even with your Drupal 8 website quickly and smoothly, via its own built-in APIs!
Introducing Alexa: The Drupal Module for Amazon Alexa Integration
With Amazon “doing its own part” and bringing the Alexa integration APIs into the equation, it was only fair that the Drupal community should play its own part, as well.
The result of their sustained efforts? The Alexa Drupal module:
which provides an endpoint for your Drupal 8 website, where it receives the vocal user requests “stored” in the Alexa Skills
“user requests” which get converted into text strings before getting sent over to the Alexa module on your Drupal site
Note: do keep in mind that the Alexa module is still under development, but with a more than promising, long-term future ahead of it.
For now, it offers basic integration with Amazon's Alexa. And this is precisely why you'll need to build a custom module, as well, to integrate Alexa with your Drupal 8 website.
But more details on this, in the tutorial here below:
Integrate Alexa With Your Drupal 8 Website: A 7-Step Guide
Step 1: Make Sure Your Site Uses HTTPS
In other words: make sure your Drupal 8 website's “easily detectable” by Amazon's servers!
The very first step to take will be to switch your site over to an HTTPS domain (a step you can skip if your site's already on HTTPS).
Step 2: Install the Alexa Module
Go “grab” the Alexa Drupal module and get it installed and enabled on your website.
Step 3: Set Up Your Alexa Skill
With your dedicated Drupal module ON, it's time to focus on all the needed setting up to be done on the Amazon Developer site. And the very first step to take is to create your own new Alexa Skill in the Skills Kit there.
Step 4: Copy & Paste Your Application ID
And this is no more than a quick 2-step process:
first, you copy the Application ID provided in your “Skill information” section, on the Amazon developer site
then you submit it to your website's configuration at /admin/config/services/alexa
Step 5: Configure Your New Alexa Skill
A key 3-part step to take when you integrate Alexa with your Drupal 8 website, where you:
give a name to the Alexa skill (in the Alexa app) to be triggered
set up an Invocation Name for your users to utter for “activating” your newly created Alexa skill
set up the custom vocal commands or “intents” that Alexa will need to respond to
For this, you'll need to go to the Amazon Development website again and access the “Skill Information” section.
Note: maximize the odds that your users' utterances will match your intents by adding several phrasings of the very same question/vocal command.
Another note: this flexibility means that you get to harness the power of... variables when setting up your custom intents. “Variables” that you'll use with the custom module that you're going to build at the following step of the process:
Step 6: Create a Custom Module for Triggering The Right Responses to Your Intents
What should happen when your custom intents get invoked and sent through to your Drupal 8 website?
You'll need to create a custom Drupal 8 module that would handle responses.
For this, insert the following info in the demo_alexa.info.yml file:
name: Alexa Latest Articles Demo
type: module
description: Demonstrates an integration with Amazon Echo.
core: 8.x
package: Alexa
dependencies:
- alexa
Note: Do keep in mind to add the Alexa Drupal module as a dependency!
Now, time to build the custom module itself:
create a file at src/EventSubscriber/
name it RequestSubscriber.php
As for the code that will “populate” your module, first of all it's the namespace and use statements that you'll need to create:
namespace Drupal\demo_alexa\EventSubscriber;
use Drupal\alexa\AlexaEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Drupal\paragraphs\Entity\Paragraph;
Then, you'll need to set up your main class, as well as a function to trigger the event:
/**
* An event subscriber for Alexa request events.
*/
class RequestSubscriber implements EventSubscriberInterface {
/**
 * Returns the events to subscribe to.
 */
public static function getSubscribedEvents() {
$events['alexaevent.request'][] = ['onRequest', 0];
return $events;
}
Next, set up the function “responsible” for giving responses to each one of your custom intents.
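Here's a minimal sketch of what that function might look like. The intent name and the response wording are placeholders, and the $event accessors follow the pattern of the Alexa module's examples, so double-check them against the module version you're actually running:
/**
 * Responds to an incoming Alexa request event.
 */
public function onRequest(AlexaEvent $event) {
  $request = $event->getRequest();
  $response = $event->getResponse();

  switch ($request->getIntentName()) {
    case 'LatestArticlesIntent':
      // Placeholder intent name, as configured at Step 5.
      $response->setOutputSpeech('Here is the latest article on the site...');
      break;

    default:
      $response->setOutputSpeech('Sorry, I did not understand that.');
  }
}

} // closes the RequestSubscriber class opened earlier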
With the code for your responses at hand, the very last file that you'll need to focus on is the demo_alexa.services.yml:
services:
alexa_demo.request_subscriber:
class: Drupal\demo_alexa\EventSubscriber\RequestSubscriber
tags:
- { name: event_subscriber }
Note: Remember to enable your demo Alexa module, then to navigate to the Amazon Developer site once again!
Step 7: Test Out Your New Alexa Skill
Another essential step to take when you integrate Alexa with your Drupal 8 website is testing your newly created Alexa skill.
And there's even a Test tab on https://developer.amazon.com for that!
Click on this specific tab, ensure that your new Alexa skill is enabled and thus ready to be tested and... see whether you'll get the right responses!
The END! This is the “how it's made” for getting Amazon Alexa to “talk to” your Drupal 8 website via:
the Alexa integration APIs
the Alexa module
a custom-built Drupal 8 module
RADU SIMILEANU / May 18'2018
It's undebatable: Node.js has practically laid the foundation of the real-time web! The real-time, two-way connection web apps have revolutionized the old web response paradigm. The one where it was just the client who could initiate communication, never the server, as well. Even so, there are certain cases when using Node.js is not the best decision you could make.
Specific use cases for which the otherwise flexible and revolutionary web server technology turns out to be... unsuitable. So:
“When shouldn't I use Node.js?”
You might legitimately ask yourself.
Here are the 3 bad use cases for this JavaScript runtime environment. Scan them through, take note of all the factors that I'll be outlining and think them through before rushing to power your next project with Node.js.
1. A CPU-Heavy Application: Using Node.js Is Simply a Bad Idea
Face it, deal with it and... adjust your decisions to it:
There are plenty of better solutions (other than Node.js) for powering your CPU-intensive app. It's just not the best technology at hand when it comes to heavy computation.
Now here's why, by using Node.js, you'll only end up “sabotaging” its very advantages, instead of turning it into a true “horsepower” for your app, as you might expect:
Node.js leverages an event-based, non-blocking I/O model, using a single CPU
hence, all that intense CPU-processing activity will actually block the incoming requests
… since the thread will get “held up” with number-crunching
The direct effect of “exploiting” Node.js in the context of heavy server-side computation?
The very benefits of its event-driven, non-blocking I/O model would get practically... nullified in the context of CPU-intensive operations.
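A minimal sketch makes this concrete: hit /compute below and every other request, no matter how trivial, has to wait in line for the number-crunching to finish (the fib(40) call is just an arbitrary way to keep the CPU busy):
const http = require('http');

// Deliberately expensive, synchronous CPU work.
function fib(n) {
  return n < 2 ? n : fib(n - 1) + fib(n - 2);
}

http.createServer((req, res) => {
  if (req.url === '/compute') {
    res.end(`Result: ${fib(40)}`); // blocks the single event-loop thread for seconds
  } else {
    res.end('Hello!'); // even this trivial route gets stuck behind the computation
  }
}).listen(3000);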
Given this, why would you stubbornly stick to Node.js, when there are other technologies more suitable for building your CPU-heavy software with? With better results?
2. A Simple CRUD (or HTML) Application
No need to get your hopes high when using Node.js with a basic CRUD or HTML application:
It might turn out to be just “slightly” more scalable, yet don't expect it to handle a traffic flood just because it's Node.js-powered.
In short: use cases like this one, where data's provided, straightforwardly, by the server and where there's no need for a separate API, render Node.js superfluous.
There are other frameworks suited specifically for this type of project (take Ruby on Rails, for instance).
Using Node.js in this case would be like driving a Formula 1 car while... stuck in rush hour traffic.
3. A Relational Database-Backed Server-Side App
Why isn't Node.js your best choice for a relational data access type of project?
Because its relational database tools aren't as reliable, robust and easy to work with as other frameworks' toolboxes (take Rails for instance!).
Rails, for example, would “spoil” you with:
already matured Data Mapper and Active Record data access layer implementations
out-of-the-box data access setup
DB schema migrations support tools
… and the list goes on
In short: if there already are frameworks perfectly “equipped” for this type of project “scenarios”, why would you stick to using Node.js? Since its relational DB toolbox is not (yet) as advanced?
In Other Words...
With these 3 “bad” use cases for Node.js “exposed”, allow me to put together a short “inventory” here, one including all the “gotchas”, aspects to consider before kicking off your Node.js project and limitations to be aware of:
Node.js hasn't been built with solving the “compute scaling” issue in mind
… but it has been created to solve the I/O scaling issue instead
excepting contexts of CPU-heavy operations, Node.js still is the best technology at hand for powering your real-time, scalable web apps with
do reconsider your decision of using Node.js if for developing your piece of software you depend on some kind of threading model
there are also poor quality packages available in npm, free to use in your Node.js application; do keep this in mind as you dig deep into the “load” of Node.js packages
Node.js will never be “the best choice” for event loop-blocking use cases (take asynchronous parsing XML, for instance)
… nor for powering apps relying on intense computation
Node.js's “worker” is geared at solving HTTP server call issues (rather than intense computing issues)
The END!
RADU SIMILEANU / May 17'2018
Not exactly the “jumping on the latest trend” type? Therefore, you're still a bit hesitant to get on the Node.js bandwagon? And all this because you still haven't got crystal-clear answers to your “What is Node.js used for?” question?
You're legitimately hesitant then! For everyone's gone crazy over it these days, but you know that there must be certain factors to consider.
Specific use cases for Node.js, as well as cases when... well... it just isn't the best idea.
You're well aware that there are some particular applications that call precisely for this JavaScript runtime environment. And you need to be 101% sure that your project fits the “profile”. One of the “best use cases of Node.js”.
But let's not meander any longer and get you some clear answers to your questions instead:
Why should you even consider Node.js one of the possible technologies to choose from for your next project?
Which are the ideal use cases for Node.js?
1. Why Would You (Even) Consider Node.js for Your Project?
Why should Node.js be on your list of... options, in the first place? On your shortlist of technologies that might power your next project?
There must be some “bundles of convenience”, some major benefits that you can “reap” from using it, right? Benefits that make it worth building your project using this specific environment.
Well, let us shed some light on these clear advantages of developing your application in Node.js:
it's built on Google's V8 JavaScript engine (a runtime environment) which, translated into clear benefits, means that it's fast and scalable web apps that you'll build using it
speaking of its scalability: Node.js is built to scale on an individual-process basis and to leverage multi-core processing on modern servers (via its Cluster module)
it's JavaScript-based... so the “pool” of developers with some kind of JS knowledge/expertise is conveniently large: 99.9% of them know at least “some” JavaScript
… not to say that this turns Node.js into the perfect choice if there are beginner developers in your team (even junior developers are at least familiar with JS)
any developer will be able to gain a quick understanding of your Node.js app's codebase
it speeds up developers' work with a collection of modules (Grunt, NPM, etc.)
it provides your development team with a great package manager, npm, with a widely available and increasingly heavy “load” of open-source tools
it's backed and actively maintained by an ever-growing community ready to... support you; the knowledge base that your development team needs to get a grip on Node.js is accessible and... free
it's open-source: you'll benefit from a single, free codebase
it's just... fast, there's no point in debating over this: the event loop and Google's innovative technologies are “turbocharging” it
it cuts down costs, as simple as that: Node.js enables your team to use the same language on the front-end and on the back-end, which translates into boosted efficiency, cross-functionality and implicitly... reduced costs
you're “tempted” with a whole range of hosting options to select from
it supports native JSON: it's in this specific format that you'll get to keep your data stored in your database
Now if I were to trim this list down to just 3 answers to your “what is Node.js used for?” dilemma, it's these 3 key benefits that I'd stubbornly stick to:
performance: Node.js is simply... fast; moreover, as a runtime environment it has enhanced JavaScript with new capabilities
versatility: from back-end to front-end apps and... pretty much everything in between, Node.js enables you to build any kind of project that you have in mind; as long as it's written in JavaScript, of course
agility: regardless of your/your team's level of JavaScript expertise, Node.js empowers you to kick-start your project, to get it up and running in no time; it's developer-productivity-oriented (just think same language for both back-end and front-end!), with a low learning curve
2. What is Node.js Used for? 7 Great Use Cases
Now back to the very question that started this post:
“What exactly is Node.js good/used for?”
There are specific app projects that this server-side JavaScript technology makes the best choice for:
2.1. Chat servers
And generally speaking any type of fast-upload system challenged to respond, in real-time, to an “avalanche” of incoming requests.
2.2. Real-time apps
This is the use case that Node.js “rocks at”. Where you get to make the most of its capabilities.
Apps geared at processing high volumes of short messages, where low latency becomes critical, make the best possible answer to your “what is Node.js used for?” question.
Here's why:
it enables sharing and reusing Node.js packages that store library code
it processes data ideally fast: quick data sync between the client and the server
it's perfectly “equipped” to cope with multiple client requests
In short: if scalability and real-time data processing are 2 critical factors to consider when choosing the best technology for your project, think Node.js!
It's built to suit specifically those situations that are “overly demanding” of our servers.
2.3. Video conference apps
...applications using VoIP or specific hardware.
Projects involving intense data streaming — audio and video files — make the best use cases for Node.js.
2.4. Instant-messaging, live-chat apps
2.5. Highly scalable apps
Think Trello- or Uber-like apps that depend on a server-side technology enabling them to scale out on multi-CPU servers.
Node.js, thanks to its cluster-based architecture, will always make the best choice for apps depending on a technology that would spread out the load across a multi-core server.
Note: speaking of scalability requirements, should I also mention that Node.js is... conveniently lightweight, too?
2.6. eCommerce transaction software and online gaming apps
“What is Node.js used for?”
For powering apps for which online data is of critical importance. Like these 2 examples here!
2.7. Server-side applications
Node.js follows an event-driven programming model: the flow is determined by messages, user actions and other specific events of this kind.
3. Afterword
Does this clear the picture for you a bit more?
As a conclusion or “final” answer to your “what is Node.js used for?” type of dilemma, the key takeaway from this post here is that:
Node.js is used primarily for web applications, but it's starting to get used more and more often for developing enterprise apps, too, thanks to its versatility.
What does the future have in store for this increasingly popular technology?
rising potential for Node.js to be used for building IoT solutions
… for “experimenting” with enterprise data
more and more big names (adding to Netflix, IBM, Amazon, Uber, LinkedIn, etc.) choosing it over legacy languages such as Java or PHP
RADU SIMILEANU / May 10'2018
Here's what the ideal decoupled Drupal scenario looks like:
Stripping Drupal to its essential role, that of a robust and flexible content repository, no Drupal expertise needed. Then using it to back your front-end with; one that you'd be free to build by leveraging any modern (JavaScript) technology of your choice.
… a Drupal back-end content store that would still preserve all its content editing and managing functionalities, needless to add.
Luckily, this is no longer “daydreaming”. Not since Reservoir, the headless Drupal distribution, has been available.
Here are some of its “promises”, or, if you prefer, the well-known challenges that this distribution's geared at solving:
to make Drupal far more accessible (cutting the intimidating Drupal setup and configuration out of the equation) to developers of all stripes
to empower developers with all the best practices for building their Drupal-backed front-ends quick and easy
to provide an opinionated starting point enabling any developer to build a Drupal content repository backing his non-Drupal application with... no Drupal knowledge needed, actually
Your Current Situation: Why Would You (Even) Consider “Headless” Drupal?
Here you are now, dealing with the pressure of:
having to deliver content agnostically across any given channel and device: single-page JS apps, mobile apps, digital signage, AR and VR-driven content, IoT apps etc...
… all while storing it (content) in one single place
providing your editorial team with a... way to conveniently edit, manage and overall administer content, via an editor-friendly UI
… independently of the development team, of course
finding a way to enable your developers to easily send content across this entire “ecosystem” of channels, devices and platforms
In other words: you're grappling with the challenge of making Drupal ideally accessible to your (non-Drupal) developers; so they can easily build their Drupal-based content store enabling them to deliver content to any given device.
… to serve it to any given app/site.
And this definitely calls for a decoupling Drupal approach.
Decoupling Drupal: The Most Discouraging Challenges You Must Be Facing
Let's assume that you're already considering headless Drupal as a solution for your current challenge, that of delivering content to multiple channels, devices, platforms.
Whether you're planning to decouple Drupal for:
building a Drupal-backed front-end, leveraging one of your modern JavaScript frameworks of choice
or using it as a content store for your non-Drupal app
Then, it's these specific challenges that you must be facing right now:
your non-Drupal developers are having trouble maneuvering Drupal content; they're not familiar with all the dials and knobs needed for making the most of Drupal's REST API
Drupal's serialization format is... alien to them
there's no starting point or well-defined best practices for non-Drupalists, that would ease their way to turning Drupal into a content repository
… one that they could back their front-ends with
True story!
And still, there is hope...
5 Reasons For Being “Skeptical” About Distributions
You must be legitimately cautious right now when it comes to using an API-first distribution for Drupal. And that's due to some bad experiences with... distributions.
Now let me try and guess some of your “fears” regarding Reservoir:
that it might turn out to be overly complex
that you risk getting “stuck with” architectural debt
that its maintainers might someday lose interest in it
that it's built primarily for other use cases, for scenarios different from your own decoupled Drupal implementation project
that you risk “inheriting” bugs in features that you haven't even used
And the list of reasons why you're not yet jumping on this decoupling Drupal trend could go on...
Introducing Reservoir: The Headless Drupal 8 Distribution! How Is It Different?
Before putting it into the spotlight and giving it a “full scan”, let me try to read your mind and identify the questions that you must be asking yourself right now:
“How precisely do I use Reservoir as a content store backing my front-end website or app?”
“Which are the bare essential Drupal modules and core functionality that this distribution comes packed with?”
“How can I leverage these ready-to-use components for decoupling Drupal?”
And now that we've put your valid queries into words, let me try and define Reservoir for you:
1st definition: a distribution for decoupling Drupal
2nd definition: an ideally flexible and minimalist tool empowering developers of all backgrounds to build content repositories for their apps to “consume”
3rd definition: the headless Drupal 8 distribution “specialized” in manipulating content and interacting with it via HTTP APIs
4th definition: a Drupal-based content store with all the web service APIs baked in, so that any developer can jump straight to building his front-end app
5th definition: simply a... content repository; one that just happens to be Drupal-based, as the Reservoir project's maintainers admitted.
Now the 4 key goals behind this distribution for decoupling Drupal — besides that of providing a simple way of building a content repository enabling you to use any technology for your front-end — are:
on-boarding developers of all stripes, making Drupal ideally accessible to... anyone
providing a much-needed opinionated starting point for any type of decoupled Drupal implementation; no Drupal knowledge required
keeping itself away from the scope creep that end-user facing products risk falling into
serving a specific decoupled use case
Decoupling Drupal Made Easy & Accessible: Key Reservoir Features
“But how does Reservoir make building Drupal-based content repositories so much easier than other... distributions?”
“How precisely does it make Drupal accessible to non-Drupal developers, as well?”
You're more than entitled to ask yourself that...
Therefore, let me outline here the out-of-the-box Reservoir features geared at speeding up any decoupled Drupal implementation. Regardless of the developer's background:
an opinionated selection of API-first/web services modules — Reservoir offers each developer a much-needed starting point/“push” so that he can ramp up and have his content store built in no time (the Simple OAuth module here included)
quick and easy access to the content back-end via JSON API (illustrated right after this list)
auto-generated API documentation that gets automatically updated, as you're browsing it, whenever your content model changes
OpenAPI format export, that supports hundreds of tools integrating with the OpenAPI specification
easy-boarding/tailored UI — expect a “welcoming tour” once you've installed Reservoir, one focused on getting you familiar with modeling and managing content, web service APIs, mapping out new content models etc.
a permission system and content editing UI empowering your editorial team to easily manage content
SDKs, libraries and references — included in the Waterwheel ecosystem — so that your development team can skip the time-consuming API learning phase and jump straight to “attaching” Drupal back-end content to their front-end apps
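As a taste of what that JSON API access looks like in practice, fetching all the article entities could look roughly like this; the /api path prefix and the article content type are assumptions, so check your install's auto-generated API docs for the real routes:
curl -H "Accept: application/vnd.api+json" https://example.com/api/node/article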
Note: Reservoir, the distribution for decoupling Drupal, deliberately shakes off some of Drupal's functionality that's irrelevant for content repositories (modules such as Breakpoint, Views, Content, the user-facing front-end etc.)
For we couldn't even talk about speeding up your decoupled Drupal project when there's an unnecessarily heavy weight of Drupal modules and features “dragging down” the whole implementation process, right?
Wrapping Up: What Reservoir Aims At Is...
... enabling your developers to jumpstart building self-hosted content repositories capable of serving any given front-end.
Front-ends that they get to build independently, tapping into the technologies they prefer, on a project-by-project basis.
Pretty convenient, don't you agree?
Adrian Ababei / May 09'2018
Whether it's the increasingly challenging workload or you simply want to enhance your Node.js app's tolerance to failure and availability, there comes a time when you just need to scale it up, right? To “squeeze” the best performance out of your entire infrastructure of... nodes. Well then, here's how to scale your Node.js app:
And scaling up your web back-end app at different levels — overall improving its throughput — sure isn't an afterthought with Node.js:
Scalability is built in the very core of the runtime.
And the infrastructure of nodes, strategically distributed, communicating with each other, is what makes this framework particularly scalable.
So, what is the best way to scale up your Node.js app?
Which are the most powerful built-in tools for scalability to explore and to “exploit”? And what are the best strategies to go for depending on your specific scenario and scalable architecture needs?
Horizontally Scaling Your Node.js App
Horizontal scaling comes down to... duplicating:
Basically, you duplicate your application instance, enabling it to “cope with” a larger number of incoming connections.
Note: you can horizontally scale your Node.js app either across different machines or on a single multi-core machine.
A word of caution: do keep in mind, though, that this scaling solution might add unnecessary complexity to your app's infrastructure; it might entail the need to provision and maintain a load balancer, might make troubleshooting more challenging, and might even change the way you deploy your app.
That being said: make sure that it's specifically this Node.js scaling solution that your project needs before you go ahead and implement it!
Vertical Scaling
If your scalability architecture needs involve nothing more than:
injecting more power
adding more memory
… with no particular “tweaking” applied to the code, then vertical scaling might just be the right answer to the “how to scale your Node.js app” dilemma.
Here's why:
by default, Node won't use more than about 1.76 GB of memory on 64-bit machines
so on a machine with, say, 32 GB of RAM, the Node process will limit itself to only a fraction of that memory
Have Multiple Processes Running on The Same Machine
Here's another possible answer to your “How to Scale your Node.js app” question:
Have multiple processes running on the same port.
It goes without saying that this scaling solution calls for some kind of internal load balancing that would distribute the incoming connections across the entire ecosystem of cores/processes.
Word of caution!
Not sure whether there's any need to add this: keep the number of running processes lower than that of the cores!
Hereinafter, let's focus on 2 Node.js built-in tools for scalability that you might want to tap into:
The Cluster Module
Node's cluster module makes a great starter for scaling up your application on a single machine.
How does it work precisely?
It makes setting up child processes sharing server ports conveniently easy.
Practically, one “master” process is in charge of spawning all the child processes (one “worker” per core), the ones that actually run your Node.js app.
Feel free to dig into the cluster module's docs for more details on the whole process.
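For reference, here's a minimal sketch of that master/worker split (assuming a plain HTTP app listening on port 3000):
const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isMaster) {
  // The master spawns one worker per CPU core...
  os.cpus().forEach(() => cluster.fork());
} else {
  // ...and every worker runs the actual app, all sharing port 3000.
  http.createServer((req, res) => {
    res.end(`Handled by worker ${process.pid}`);
  }).listen(3000);
}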
Yet, there are certain limitations to this basic scaling solution:
in case one of your child processes “dies”, it doesn't... regenerate itself
you'll need to handle the master-worker processes difference... the “old school way”, using an “if-else” block
there's no way of modifying multiple processes, at once, on-the-fly!
Note: yet, when it comes to the “dead child processes” drawback, there's... hope. For instance, use this piece of code that would enable the master process to... respawn the “worker”:
cluster.on('exit', (worker, code, signal) => {
  // A worker died: fork a replacement so capacity stays constant.
  cluster.fork();
});
And voila! This drawback has been taken off your list!
The PM2 Cluster Module
With the PM2 cluster module, the “how to scale your Node.js app” dilemma turns into:
“Lay back and let the PM2... clusterfy your server for you!”
All you need to do is “trigger” this command's superpower:
pm2 start app.js -i 4 --name="api"
It will instantly create a 4-node cluster for you!
Now, here are some more details about what's going on “under the hood” during this process:
the PM2 daemon will take over the former “master” process's role and spawn N processes (the former “worker” processes), relying on round-robin balancing
moreover, if it's PM2 process manager that you're using, your process gets automatically scaled across all the existing cores (no need to trigger the cluster module for that anymore)
also, the same PM2 process manager will ensure that processes restart, instantly, if they happen to crash
You'll just need to write your Node.js app as if it were for single-core usage and the PM2 module will make sure that it gets scaled for multi-core.
Note: now if you want to scale your Node.js application further, you might want to consider deploying more machines...
Scaling Across Multiple Machines with Network Load Balancing
The underlying process is more than similar to the “multiple core scaling” one, if you come to think of it:
Instead of several cores, you'll have several machines; each one will be running one or more processes and will get “backed up” by a load balancer redirecting traffic to each machine in this infrastructure.
“And how does a network balancer work, more precisely?” you might ask yourself:
Once a request is sent to a node, the balancer sends the traffic to a specific process.
And there are 2 ways of deploying your internal balancer:
deploy a machine and set up a network balancer yourself, using NGINX
use a managed load balancer (like Elastic Load Balancer); setting it up is conveniently easy and it “spoils” you with all kinds of built-in features, such as auto-scaling
Now if your “How to scale your Node.js app” question turns into an “Isn't it risky to have just one point of failure in my infrastructure?”:
Just deploy multiple load balancers instead of relying on a single balancer.
They would be all pointing to the same servers, needless to add.
Note: for distributing traffic across your “ecosystem” of internal balancers, you could just add several DNS “A” records to your main domain.
How to Scale Your Node.js App: 3 Scaling Strategies to Consider
1. Decomposing
“Microservice” is another word for this scaling strategy. For practically you'll be “juggling” with multiple microservices (although their size is of no significant importance, actually).
Or multiple applications, with different codebases (and in many cases, each one of them has its own UI and dedicated database).
And it's by services and functionalities that you'll be decomposing/scaling your Node.js app. A strategy that can lead to unexpected issues in the long run, but which, if implemented correctly, translates into clear gains for your apps' performance.
2. Splitting
Or “horizontal partitioning” or “sharding”, if you prefer. This strategy involves splitting your app into multiple instances, each one responsible for a single, specific part of your app's data!
Word of caution: data partitioning calls for a lookup before you carry out each operation; this way you'll identify the right instance of the application to be used.
Take this example here:
You might want to partition your Node.js app's users by language or area of interest. In this case, the lookup step is a must; you'll need to check that information first.
3. Cloning
And this is the easiest strategy at hand for solving your “How to scale your Node.js app” dilemma!
Just clone your Node.js back-end application, multiple times, and assign a specific part of the workload to each cloned instance!
It's both effective and cost-effective!
Moreover, Node's cluster module makes cloning on a single server ideally easy to implement!
And this is “How to scale your Node.js app”! See? You have not just one, but several Node.js built-in tools at hand and various strategies to choose from, depending on your scaling needs.
Which scaling solution suits you/your app project best?
RADU SIMILEANU / May 03'2018
Have no fear... Node.js 10 is here (since April 24, actually)! And, as expected, this version is planned to grow into the platform's official Long Term Support version (in October 2018); to be supported for 3 years after that date.
So? What's in it for you, the back-end web developer?
Are there any new features and improvements worth getting really excited about? Which are they, and how precisely will they improve the overall developer experience?
Now before we take a deep dive into the “steamy fresh load” of new features, I feel like pointing out that:
it's mostly incremental improvements, applied throughout the entire codebase of the platform, that Node.js 10 ships with
… performance, reliability and stability-centered improvements, bubbling up to the back-end developer's experience
But let's name these improvements that ship with the new version of Node.js. Let's talk specific incremental changes, shall we?
10 of the “really worth getting excited about” ones:
1. Error-Handling Improvements
And error message/error-handling improvements do make up the majority of the semver-major commits (approx. 300) that Node.js 10 ships with.
It's a “pledge” made since Node.js 8.0.0 to assign static error codes to all Error objects:
“Error messages should be useful, more consistent and predictable”, this has been the “pledge” driving all the sustained efforts geared at improving error-handling.
Note: error codes have been included in Node.js 10, making constant error-checking conveniently easier!
2. Enhanced JavaScript Language Capabilities
There's an entire list of Node.js 10 language improvements (you can find them all here) worth exploring and... exploiting; I'll outline the highlights only:
you now get to use the line and paragraph separator symbols (U+2028 and U+2029) in string literals, matching JSON
V8 “introduces”: String.prototype.trimStart() and String.prototype.trimEnd() (with String.prototype.trim() already there)
Function.prototype.toString() now returns the exact source code text (comments and whitespace here included!)
the catch clause of a try statement no longer requires a parameter
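A quick illustration of those last points (runnable as-is on Node.js 10):
const padded = '  hello  ';
console.log(padded.trimStart()); // 'hello  '
console.log(padded.trimEnd());   // '  hello'

// The catch clause no longer needs a parameter when the error goes unused:
try {
  JSON.parse('not json');
} catch {
  console.log('parsing failed');
}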
3. The Node.js fs (file system) Has Been Significantly Overhauled
And here are the most “dramatic” improvements made during this overhaul:
the type checking and error handling have been improved
the code got restructured, for easier maintainability
a new experimental fs/promises API got implemented, featuring first-class Promise-based API
Speaking of this new API, being experimental it generates a warning at runtime, the very first time that it gets used. Hopefully, things will turn out “bug-free”, so that it can grow from experimental to stable.
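Here's a minimal sketch of the Promise-based flavor (exposed as fs.promises on later 10.x builds; config.json below is just a placeholder file):
const fsPromises = require('fs').promises;

async function readConfig() {
  // The first use triggers the experimental-API runtime warning mentioned above.
  const data = await fsPromises.readFile('./config.json', 'utf8');
  return JSON.parse(data);
}

readConfig().then(console.log).catch(console.error);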
4. Node.js 10 Ships with Full Support for N-API
N-API — the ABI stable (Node.js) API for native modules — has leveled up to a stable version in Node.js 10.
What does this mean?
it provides a stable module API, one that is not influenced by the changes in Node.js's V8 JavaScript engine
the API layer makes upgrading a whole lot easier, streamlining production deployments and... easing module maintainers' lives
… and it goes without saying that this bubbles up to native modules' maintenance costs, as well
In short: say goodbye to module breakage!
5. The Assert Module: Explore Some Crucial Improvements
All efforts targeting the assert module have been aimed at easing the internal implementation and improving the developer experience.
But let me point out some of these improvements that eventually fleshed out and landed in Node.js 10:
a new “diff” view got implemented, for whenever assertion errors get generated
overall the output becomes more descriptive, more... “verbose” (and implicitly more useful)
better object comparisons
promises support
detailed error messages
6. Node.js 10 Ships With V8 6.6: Expect a Major Performance Boost
Get ready to “exploit” V8 6.6's range of performance improvements to their full potential! Along with its new set of JavaScript language features!
From them all, I can't but mention:
the async functions
the async generators
the promise execution
7. Cryptographic Support
Node.js 10 is the first version of the platform to ship with OpenSSL 1.1.0! And this can only translate into:
Enhanced protection for your priceless data!
Now, if I am to outline just 2 of the OpenSSL features to look forward to tapping into, I should definitely mention:
the Poly1305 authenticator
the ChaCha20 cipher
8. The Trace Events Mechanism: Monitoring Your Code's Performance Just Got Easier
That's right! Keeping a close eye on how your code's performing and being able to quickly diagnose any emerging issues is easier than ever with Node.js 10!
Basically, what these trace events do is enable all the diagnostic information output to be collected into a file that the Chrome browser's DevTools utility can read.
No need to use a command-line flag anymore to trigger this whole trace events mechanism underlying Node.js.
And since we're here, let me point out to you 2 trace events-related improvements worth getting (really) excited about:
the node.perf.usertiming category got added; its role is that of capturing, in the trace events timelines, all the Performance API user timing marks and measures.
the JavaScript API got implemented, as well; enabling/disabling trace events dynamically is now possible in Node.js:
const trace_events = require('trace_events')
const tracing = trace_events.createTracing({
categories: ['node.async_hooks', 'v8']
})
tracing.enable()
// do stuff
tracing.disable()
9. HTTP and HTTP/2 Improvements
Another thing to get excited about, when it comes to Node.js 10's release, is given by all the incremental improvements made to HTTP and HTTP/2.
Let me detail a bit:
when it comes to HTTP, the changes applied range from improved Streams API compatibility and stricter standards support to improved header and error handling
now when it comes to HTTP/2, significant progress has been made towards getting it as close to “stable” as possible before Node.js 10 enters its Long Term Support cycle. And I'm talking here about improvements made to the way trailing headers in requests and responses get implemented, and about overall improvements to the internal implementation and the public API
10. Node.js 10 Ships With The Experimental Node-ChakraCore
And how does this impact the developer experience? Your experience?
using the JavaScript engine to its full potential
tapping into the Time Travel debugging
… gets a whole lot easier for you. You're practically enabled to detect errors way before they even get to “infest” your code.
The END! This is our list of 10 Node.js 10 features worth getting (really) excited about! Do explore them and start preparing your move to this new version of Node.js before October!
RADU SIMILEANU / May 02'2018
Here you are now: your Angular 4 front-end app ready to... wow its users! “Almost ready” actually! For it still needs styling... And what better HTML and CSS framework to go for than Bootstrap, right? But how to use Bootstrap with Angular 4 more precisely?
How do you properly integrate it into your Angular 4 CLI project?
Great news: you have not just one, but 3 options at hand for adding it!
Let me get into details:
On Using Bootstrap in Your Front-End Development Process
Is there any need to list here the reasons why it's precisely Bootstrap that you're planning to implement into your Angular CLI project? Angular 4, to be more specific.
After all, it's the most popular framework for styling websites built in HTML, CSS and modern web & mobile JavaScript frameworks (like Angular here):
It's an open-source, feature-rich framework that turns front-end development into such a “breeze”. Basically, it empowers you to build responsive layouts without the need to be a CSS “expert”.
And now, let's break down further with the step-by-step “tutorial” on how to use Bootstrap with Angular 4:
Step 1: Create a New Angular Project Using Angular CLI
The very first step to take is obviously setting up a brand new project.
Use the Angular Command Line Interface to generate it.
But first, install it on your system:
$ npm install -g @angular/cli
It's only then, once you've installed its NPM package, that you can go ahead and... generate your new project.
For doing this, just type the following command in your CLI:
$ ng new myproject
Next, feel free to change into that specific directory and to turn on the web server:
$ cd myproject
$ ng serve
“App works!” This is the message that you should be seeing in your browser right now.
Step 2: Install Bootstrap to Your Project
Now that you've launched your new Angular project, it's time to add your Bootstrap library, as well.
And you sure aren't nickel-and-dimed in options. There are 3 ways to add Bootstrap to Angular 4.
Step 3: How to Use Bootstrap with Angular 4 — 3 Different Ways to Integrate It
Option 1: Install Bootstrap from CDN
And there are 2 particular files that you'll need to install from CDN into your project:
the Bootstrap CSS file
the Bootstrap JavaScript file
Note: keep in mind to add the jQuery JavaScript library file, as well!
Next, open the src/index.html file and insert the following:
the <link> element to add the Bootstrap CSS file at the end of the head section
a <script> element for adding jQuery at the bottom of the body section
a <script> element for inserting the Bootstrap JS file at the bottom of the body section
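Put together, those three additions could look like this (the CDN URLs and version numbers below are illustrative, so pin whichever release you actually target):
<!-- at the end of the <head> section: -->
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css">

<!-- at the bottom of the <body> section: -->
<script src="https://code.jquery.com/jquery-3.3.1.min.js"></script>
<script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/js/bootstrap.min.js"></script>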
Eager to see “Bootstrap in action” in one of your project's component templates? Then give it a try:
open the src/app/app.component.html
enter the following code there:
<div class="container">
<div class="jumbotron">
<h1>Welcome</h1>
<h2>Angular & Bootstrap Demo</h2>
</div>
<div class="panel panel-primary">
<div class="panel-heading">Status</div>
<div class="panel-body">
<h3>{{title}}</h3>
</div>
</div>
</div>
And it's the following message that this HTML template code should trigger in your browser:
“app works!”
Note: go for a Bootstrap theme of your choice; once you've downloaded it (from Bootswatch.com for instance), its bootstrap.min.css file will get instantly opened up in your browser.
Just copy the file's URL and use it to replace the string assigned to the href attribute of the <link> element, in the index.html file.
And voila! It's precisely those colors, defined by your chosen theme, that get displayed in the browser now!
Option 2: Install Bootstrap using NPM
And here's another valid answer to your “How to use Bootstrap with Angular 4” dilemma!
Simply enter:
$ npm install bootstrap@3 jquery --save
It's this command that will integrate Bootstrap and jQuery into the node_modules folder of your Angular 4 project directory.
Moreover, it will include these 2 dependencies in the package.json file, as well.
Once properly installed, you can find both packages at:
node_modules/bootstrap/dist/css/bootstrap.min.css
node_modules/bootstrap/dist/js/bootstrap.min.js
node_modules/jquery/dist/jquery.min.js
Note! You have 2 options for integrating those files into your Angular 4 project:
add the file paths to the styles and scripts arrays of the angular-cli.json file (see the sketch below)
add the corresponding <script> and <link> elements to your index.html file
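For the first option, the relevant part of the angular-cli.json file would look roughly like this (paths relative to the src folder, per the NPM install above):
"styles": [
  "../node_modules/bootstrap/dist/css/bootstrap.min.css",
  "styles.css"
],
"scripts": [
  "../node_modules/jquery/dist/jquery.min.js",
  "../node_modules/bootstrap/dist/js/bootstrap.min.js"
]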
Option 3: Add NG-Bootstrap to Your Project
The great thing about this method is that you'll no longer need to add the jQuery and Bootstrap JS dependencies. Ng-Bootstrap comes packed with a set of built-in native Angular directives based on Bootstrap's markup and CSS.
Now, getting back to our initial “How to use Bootstrap with Angular 4” question, let's see how we install this NPM package.
For this, just enter the following command in your Angular 4 project directory:
npm install --save @ng-bootstrap/ng-bootstrap
Next, make sure you also install Bootstrap 4 to your project:
$ npm install bootstrap@4.0.0-alpha.6
And, the final step is to add the bootstrap.min.css file to your .angular-cli.json file's styles array; no jquery.min.js or bootstrap.min.js needed this time, as promised.
Now you still need to import Ng-Bootstrap's core module, NgbModule, from its @ng-bootstrap/ng-bootstrap package.
To do this, just type the following import statement into app.module.ts:
import {NgbModule} from '@ng-bootstrap/ng-bootstrap';
All there's left for you to do now is to add the NgbModule to the @NgModule decorator's imports array, as shown in the sketch below.
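A minimal app.module.ts would then look something like this (NgbModule.forRoot() being the root-module import form for that generation of Ng-Bootstrap):
import { NgModule } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';
import { NgbModule } from '@ng-bootstrap/ng-bootstrap';

import { AppComponent } from './app.component';

@NgModule({
  declarations: [AppComponent],
  imports: [BrowserModule, NgbModule.forRoot()],
  bootstrap: [AppComponent]
})
export class AppModule {}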
And since we're here, you'll find some more than “enlightening” info (chunks of code here included!) on the 2 different options at hand for importing the NgbModule:
either in your project's child modules
or in the root module itself
… in this article here on Using Bootstrap with Angular.
Using The NG-Bootstrap Components: Which Are They?
With the NgbModule installed into your Angular 4 project, you're now able to use the Ng-Bootstrap components.
To leverage them in your app.component.html.
Speaking of which, here are the components at hand:
Accordion
Alert
Rating
Tabs
Carousel
Progressbar
Collapse
Datepicker
Buttons
Pagination
Typeahead
Popover
Timepicker
Dropdown
Modal
Tooltip
The END! Does this answer your “How to Use Bootstrap with Angular 4” question?
Which method of adding this front-end framework to your project is more suitable for you?
Silviu Serdaru / Apr 30'2018
Informative, entertaining, engaging and... a key revenue source! These are just some of your expectations regarding your Magento 2 blog, right? Well, then, get ready to check them all off your “wishlist” digging through my shortlist of can't-believe-its-free Magento 2 blog extensions.
From SEO-oriented to shipping-focused features, from functionalities centered on social media to those geared at enhancing page loading speed, these 9 extensions are, each, extra functionalities to inject into your blog.
So that it (your blog) serves your specific needs and helps you reach your goals. And all that without having to “stretch” your budget (there are only 100% free extensions in this list)...
Oh, yes: and they're all wearing the signatures of certified Magento partners!
And now, let's get straightaway to these must-have Magento 2 extensions that you should be turbocharging your blog with:
all of them “spoiling” you with configurations that make customization unexpectedly easy
… blending perfectly into your blog's design and fitting into your codebase (no need to depend on an “army” of coding experts)
1. Magento 2 Image Slider
Let's review a visual/aesthetics-oriented extension first things first.
For, as above-mentioned, a “money-making” blog shouldn't be purely informative and helpful, but... engaging, visually-arresting, as well.
So, imagery does play its major part here!
Now here are a few of this extension's key features:
supports no fewer than 10 sliders
built-in support for inserting videos, text and images
one of those fully responsive free Magento 2 blog extensions
provides tons of animations, with Live Preview, for you to select from
supports OWL Carousel
conveniently intuitive UI
you're free to display it anywhere on your blog via CMS pages & widgets
2. Facebook Live Chat
A blog is the ultimate channel of communication with your brand's audience. With your e-store's regular and potential customers.
Well, then moving from standard communication to... instant communication is a must if you want to meet their expectations. And this is what makes Facebook Live Chat one of the must-have free Magento 2 blog extensions.
It's that chatbox incorporated into your blog that's powerful enough to turn “just” guests into loyal customers.
And now, let me point out to you some of its most powerful features:
there's a Like button and a store profile incorporated into the chatbox
user statistics capabilities
unlimited chat history
you get to set upcoming events, define greeting text and integrate your e-store's Facebook profile into the chatbox
simple backend operations for enabling/disabling the chatbox displayed on your blog
familiar UI: a chatbox resembling the Facebook Messenger interface
3. Magento 2 Lazy Load
A must-have extension for your Magento 2 blog if you care about the user experience that you provide there. And page loading speed does play a key role in improving (or degrading) it.
Moreover, besides optimizing your blog's performance, Magento 2 lazy load creates some aesthetically-pleasing image transitions influencing the UX.
But let's get deeper into details and "unearth" all those advanced features that make this extension one of the must-haves (you'll find a generic sketch of the lazy-loading technique right after this list):
it helps you save your web server resources — saves bandwidth and minimizes server requests
it creates smooth, blurring effect transitions for your lazy load images
… and a smooth, visually-pleasing transition when users keep scrolling down your pages
it gives your blog a ranking boost thanks to the search-engine-friendly markup it generates
it optimizes your blog's page loading time
you're free to enable/disable the “Lazy Load” mode for each one of your blog's pages
you get to set an advanced time point for loading your pages' images
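To make the concept concrete, here's a generic TypeScript sketch of the lazy-loading technique itself (this is not the extension's code, just an illustration of the browser-side idea): images start out with their real URL in a data-src attribute, which only gets copied into src once they scroll into view.
// Generic lazy-loading sketch using the browser's IntersectionObserver API
const observer = new IntersectionObserver((entries, obs) => {
  for (const entry of entries) {
    if (entry.isIntersecting) {
      const img = entry.target as HTMLImageElement;
      img.src = img.dataset.src as string; // real URL stored in data-src
      obs.unobserve(img); // each image only needs to load once
    }
  }
});
document.querySelectorAll<HTMLImageElement>('img[data-src]')
  .forEach((img) => observer.observe(img));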
4. Better SEO, One of the Free Magento 2 Blog Extensions You Should Be Using
Inject Better SEO into your blog and... propel it in the search engines results!
And it's not “just” packed with clever features, but ideally easy to use, as well. Built to fit into your blog's existing code structure and to empower you to customize it to serve your SEO goals in detail.
I'm talking here about:
meta descriptions
meta keywords
… that this extension's flexible enough to allow you to insert quickly and easily.
Now that we've settled that Better SEO makes an ideally customizable, blog/store-friendly extension, let's check out its powerful features:
SEO checklist: a more than handy "To Do" list, pointing out the SEO tasks to complete for reaching a high SEO score
it detects duplicate content issues
advanced HTML/XML sitemaps: one for the users, the other one to be used by search engines
structured data: implements schema.org structured data (illustrated right after this list)
metadata template rules: easy to define mass and dynamic metadata for your pages, categories, layered navigation
provides you with actionable SEO reports
rich snippets preview
cross links
social optimization
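For context, schema.org structured data is machine-readable markup that search engines use to build rich results. The extension generates it for you, but a hand-written (purely hypothetical) JSON-LD example for a blog post would look roughly like this:
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": "A hypothetical post title",
  "datePublished": "2018-04-27",
  "author": {
    "@type": "Person",
    "name": "Jane Doe"
  }
}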
5. Exto Analytics
Applying a marketing strategy that lacks the proper data-fuel is like aiming at a target... blindfolded.
So, if relying on pure chance doesn't define you and if you want to go beyond the data provided to you by the native Magento 2 reporting functions, go with Exto Analytics.
Here are some more heavy-weighing reasons to do so:
real-time mobile dashboard, so you remain "connected to" your data anytime, anywhere
convenience at its best when it comes to handling your reports: you get to sort data by specific columns and even to turn off individual columns
date range picker: compare and evaluate your blog's performance over different periods of time
your previous data gets added to your reports, as well, once you install the extension
a chart, enabling you to visualize all data reports in parallel
6. Magento 2 Admin Theme
From user experience to... admin experience.
As your own blog's admin, you should also consider making your dashboard more user-friendly and intuitive.
For a high level of convenience on your side will bubble up, eventually, in the experiences that you'll create for your visitors.
But let's see specifically what makes Admin Theme one of the best Magento 2 blog extensions to use:
mobile optimized
easy to use and quick to customize
retina ready
clean, neatly structured code
a different interface for Login & Forgot Password
admin icon font
translation-ready
7. Magento 2 Infinite Scroll
It does precisely what its name says: it keeps loading content, without interruption, as your blog guests scroll down.
Fluidity in the way you present content to your readers translates into improved user experience!
And now, let's scan through this extension's specific features:
you can display and easily change the “Show” button, along with its loading text
the navigation bar can be placed anywhere on the page
you can implement it both on your category pages and on your search page
the pages that your readers land on get automatically loaded
while scrolling down, your blog guests know, at all times, what section of the blog they're on
you get to customize your progress bar to your liking
users get to share the links of those specific pages that they reach during their scrolling (for instance, if they're on page 8 of your blog, they can bookmark/share the link of precisely that page)
8. Better Blog
Now, let's imagine that you don't own a blog yet, “only” an e-store.
And that now you want to integrate a simple blog, as well. One that should:
be conveniently easy to configure
have a beautiful layout design to “wow” your readers with
load fast
come packed with much-needed backend features, making updating content unexpectedly easy for you, the admin
Checked, checked, checked!
The Better Blog is undoubtedly one of the must-go-to Magento 2 extensions no matter the size of your current e-commerce site.
Once integrated into your Magento store's backend, you'll get to manage both your store and your blog from the very same place.
Here are the main reasons why it still is one of the best Magento 2 blog extensions:
SEO friendly: SEO-friendly URLs, metadata information, XML sitemap
open source code
layered navigation, with a significant impact on UX (your blog guests get to quickly track precisely those posts that they're looking for)
out-of-the-box comment functionality: Disqus Comment, Facebook Comment
blog topics
built-in product recommendations feature: "Who Bought This Also Bought", "Auto Related Products", "Frequently Bought Together"
the option to integrate your store or your blog's sitemap
responsive design
social sharing buttons
blog widgets: show your (recent) posts on your site's homepage (sidebar here included)
In short: you get to integrate a simple blog with your e-store with no need for a third-party framework!
Moreover, you'll be managing comments, categories and posts right from your Magento 2 admin, quickly and easily.
And you'll get informed each time a blog guest posts a comment, not to mention that the extension grows into a powerful "ally", supporting your SEO efforts.
One of the must-have Magento 2 extensions without question!
9. Magento 2 SMTP
A powerful extension to "turbocharge" your Magento 2 blog with, so you:
gain total control over your email customization process
get to run tests on your Magento 2 SMTP server
And it does all that by providing your blog with a configurable host and port.
Now, let's go through its cool features:
it keeps logs of all sent emails
built to support 20+ SMTP service providers
enables you to test how well your current email settings are doing
it empowers you to customize your emails in the slightest detail
The END! These are the 9 best Magento 2 blog extensions that you should be using. Scan through them, "weigh" their feature loads against your own needs and growth plans for your blog and... go for the most suitable ones!
Adriana Cacoveanu / Apr 27'2018