Content is far too valuable an asset not to handle with the utmost care, from its creation to its revision, all the way to its... distribution. And with the utmost efficiency, as well! But how do you choose the business software to “orchestrate” your entire content workflow? When, on one hand, you have the top enterprise content management systems in 2018 and, on the other hand, you have... Drupal?
And the dilemma that you're facing right now could be summed up like this:
Choosing between a complex ECM system with a load of powerful tools that comes at a cost and a feature-rich one — already famed for its robustness and customization options — with no price tag on...
Now, to ease your decision-making process, let's compare the top-rated enterprise information management solutions to Drupal by weighing their feature loads against their costs.
1. But What Is an Enterprise Content Management System More Precisely?
First, let's try to define what we mean by “content” in relation to a content management software:
Content is all the written pieces of information entering and “moving about” your organization. It comes in the form of:
internal process documents
content for your company website (or blog)
sales-focused content
targeted, custom content available to paying customers only
... and the list goes on.
As you can see, I've intentionally left out graphical and audio-visual content. And that's because it's only text-based digital content that a CMS would handle.
Now, coming back to our initial question:
An enterprise content management system is software geared at managing all the processes in your content's lifecycle: creation, revision, publication, distribution to multiple channels, promotion, etc.
Packed with different sets of tools designed to automate all your content-based processes, an ECM system is a... “Swiss knife” type of business software.
The one you'd use to streamline your content workflow(s).
2. M-Files, One of the Top Enterprise Content Management Systems in 2018
Introducing the enterprise-level information management solution of the year: M-Files!
The promise that it makes?
To break the “siloed information” pattern and enable users to access specific content from any business system and any device.
… to easily access it, but also to organize it, to manage it, to identify particular information/documents, to set up custom workflows and even to manage document reviews.
Top features
version control
automated workflows
pre-built search engine: you get to track documents by type, name, keywords; it provides within-text search features as well
notifications: users get alerted whenever they need to review or approve changes made to documents
approval processing
permission management and offline access
integration capabilities: it easily integrates with Microsoft Dynamics, NetSuite, SAP, Salesforce
document collaboration tools: co-authoring features and check-in/check-out tools
Price
M-Files is one of those enterprise content management vendors that use quote-based pricing for their services.
Basically, there are no standard prices, as there are no standard packages that they offer, only tailored content management solutions.
Cons
The great majority of negative user feedback revolves around the M-Files mobile app's limited functionality.
3. OnBase
Another one of the top enterprise content management systems in 2018 is OnBase:
An all-in-one software coming “equipped” with:
business process management tools
integrated document management tools
records management tools
And before I “expose” its most heavyweight features to you, I feel that I should put the spotlight on its versatility first:
You get to easily configure your OnBase ECM system to fit any environment of choice.
Top Features
approval process control
indexing
version control
built-in search engine
document management
Cons
Do expect a steep learning curve! So, be prepared to invest a significant amount of time in growing comfortable with using it.
In learning to “juggle” with all its apps and functionalities.
Price
You'll need to contact the OnBase team for a custom pricing plan.
4. Box
Box is a cloud content management platform built to assist you with:
sharing your files online
storing your files
integrating content across your entire “infrastructure” of digital tools via open APIs
collaborating within your team
Top Features
granular access permission
easy integration with other platforms
advanced security capabilities: device trust, watermarking, data governance
collaboration tools: a document management system that enhances collaboration among end-users on various file types and devices; tools which also enable them to choose the right storage place, to set up metadata-driven content workflows etc.
Cons
Even top enterprise content management systems manage to collect their own “pile” of “bad reviews”. What users hold against Box here, for instance, is its user-based pricing model.
In other words, if you have 100+ people in your company, expect to get charged separately for each user... and thus to overstretch your budget over time.
Price
Box pricing plans start at €4.50 per user/month (that's the starter business plan) and can go up to $500 per month or more if you select a “Build with Box” platform plan.
5. Drupal
And now that we've put the top-rated ECM systems in 2018 into the spotlight, let's see what Drupal here has to offer. How it can counterbalance all these heavy loads of tools, features, and functionalities.
Drupal's Key Features
advanced integration capabilities: Drupal “spoils” its end-users with conveniently accessible API, backed by a rich collection of modules built precisely for 3rd party integrations
no maintenance effort required: since it runs in Acquia Enterprise cloud, Drupal gets automatically updated; maintenance is already included in the Enterprise support costs plan
feature richness: and we're talking here about features, plug-ins (thousands of them) and content management tools that you get right out of the box
modular architecture: which goes hand in hand with the unlimited freedom of customization that you'll get to leverage
high performance: Drupal's already famed for its robustness and capabilities to withstand high influxes of traffic
unmatched scalability
a full toolbox (contributed modules here included) put at editors' disposal: Drupal's also won its reputation as a CMS that's been constantly improved to enrich the experience; all the in-built content-handling tools speak best of its “empower the content creator/end-user” philosophy
Price
license costs: unlike the top enterprise content management systems previously outlined, Drupal's open source; there are no license costs, only support costs associated with the Acquia Enterprise Platform
no vendor lock-in: all modules and plug-ins that you might select and mix and match to custom-tune your CMS are free
development costs: Drupal resources are available to anyone who wants to build, custom-tune and scale up their CMS
In conclusion...
… Drupal comes feature-packed and, moreover, it “spoils” you with unlimited freedom of customization. And all this without putting a price tag on.
On the other hand, some of the top enterprise content management systems do tempt you with their feature richness, but at a cost. One that can go up precisely if you feel like customizing your ECM solution or scaling it up sometime in the future.
In short: you do get your share of customization freedom... but not for free.
So, it's not really an “apples vs oranges” type of dilemma that you're facing, but rather an:
Apples vs Apples with a price tag on
Adriana Cacoveanu / May 29'2018
About to build your very first Angular app? Then you must be planning to create an Angular project with Angular CLI, right? The much-acclaimed tool that the Angular team built precisely to jumpstart the whole development process.
… to have a simple app scaffolded, generated and deployed in no time, by entering just a few commands (so they say, at least).
And since I'm sure that you don't want to waste these bundles of convenience by getting tangled up in overly complex explanations instead, I've kept things simple with this guide.
So, here's how you create and run your first Angular project via the Angular command-line interface:
1. But How Precisely Can Angular CLI Jumpstart Your App's Development Process?
Take this command-line interface as a starter kit “present” that the Angular team has nicely wrapped for you:
it's practically geared at empowering you to get up and start developing a new Angular app in no time
it takes just one short command to generate a default Angular project that would include all the needed dependencies (in the node_modules folder), as well as testing files for each component
Now, this is what I call a major kickstart! It's got you covered from the stage of setting everything up, to creating the Angular project itself, to testing it and finally deploying it.
Other key usages of the Angular CLI that we're not going to focus on in this tutorial here are:
running a local development server (with live reload)
building applications for production
adding new features to your existing app
running tests on your application units
2. Setting It Up: Install Angular CLI Globally
Before you jump to the part where you create an Angular app using Angular CLI, you need to install the command-line interface itself. Globally!
And this is the “power” command to enter in your terminal, with npm installed:
npm install -g @angular/cli
Notes:
if you already have the CLI installed, make sure it's the latest version
if you're updating from the old angular-cli package, first uninstall it with the commands below, then reinstall @angular/cli using the install command above:
npm uninstall -g angular-cli
npm uninstall --save-dev angular-cli
And there's more! A few more must-have dependencies that you need to make sure are already installed and upgraded to their latest versions:
Node.js: v6.9.x +
npm: 3.x.x +
3. Create an Angular Project With Angular CLI
With your command line interface ON, use it to enter THE one and only command that will generate a new Angular project for you. One incorporating, by default, all the needed dependencies:
ng new ng-yourproject
Tada! A directory named “ng-yourproject” has just been generated. That's where your new Angular project, along with all the requested dependencies, gets stored.
Eager to test it out? Just run the following command in your terminal:
ng serve
Your Angular app will then get built and served up at localhost:4200. Open this URL in your browser and you should see the default welcome screen:
Basically, it's the default application shell itself rendered by the CLI.
4. “Deconstructing” Your New Angular Project: What Does It Include?
Now, before you go ahead and do your “tweaking” on your newly created app, let's see what you've got in there! What are the elements that the CLI has generated for you to help you jump to development right out of the box?
For this quick “scan”, open your Angular project in your IDE of choice and start “exploring” your src/ folder:
src/*
styles.css
any styles that you plan to apply globally go into this file; it's also where you can import new .css files (Bootstrap or any other styling framework of your choice)
index.html
where your Angular app gets started
src/app/*
app.component.ts
this is where your app's one and only (for now at least) component gets stored
app.module.ts
the root module that tells Angular how to assemble and manage your app's components
note: @NgModule marks the class as an Angular module, much like @Component marks a class as a component
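For reference, here's roughly what the generated app.module.ts looks like, trimmed down to its essentials (the exact boilerplate may differ slightly between CLI versions):
import { BrowserModule } from '@angular/platform-browser';
import { NgModule } from '@angular/core';
import { AppComponent } from './app.component';

@NgModule({
  declarations: [AppComponent], // the components this module owns
  imports: [BrowserModule],     // other modules whose exported features we need
  providers: [],                // services available for dependency injection
  bootstrap: [AppComponent]     // the root component Angular bootstraps on startup
})
export class AppModule { }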
5. Create a New Component
Remember the “one and only component” that I mentioned during the previous “inventory” of everything the CLI has generated in your project?
Well, how about creating a new one now? One that would load under that root component?
Just run the following command to generate it:
ng generate component the-quote
Next, it's time to “show it off” in your browser. Open the new component's template (the-quote.component.html) and add:
<h3>{{myQuote.quote}}</h3>
<small>- {{myQuote.by}}</small>
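Of course, the template above assumes that the component class actually defines a myQuote property. Here's a minimal sketch of what the-quote.component.ts could look like (the quote object and its text are just placeholders):
import { Component } from '@angular/core';

@Component({
  selector: 'app-the-quote',
  templateUrl: './the-quote.component.html',
  styleUrls: ['./the-quote.component.css']
})
export class TheQuoteComponent {
  // placeholder data; in a real app this would probably come from a service
  myQuote = {
    quote: 'Simplicity is the soul of efficiency.',
    by: 'Austin Freeman'
  };
}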
Then add the app-the-quote selector to the template of the root component (app.component.html) that the CLI generated in your Angular project:
<h1>
{{title}}
</h1>
<app-the-quote></app-the-quote>
6. Apply External Styling
Now, you'll agree that when you create an Angular project with Angular CLI, applying styling is a key step.
So, let's add your favorite CSS framework to your new application!
Me, it's Bulma that I'll be using in this tutorial here:
npm install bulma --save
With our CSS framework installed, we need to get its CSS file loaded into our Angular app. For this, just add the file's relative path to the styles array in the .angular-cli.json file:
...
"styles": [
"../node_modules/bulma/css/bulma.css",
"styles.css"
],
...
“Tempted” to display some icons now, as well? Then go ahead and add the Font Awesome library as a CDN link.
For this, just include the stylesheet right into your index.html:
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/font-awesome/4.7.0/css/font-awesome.min.css">
Et voila! This is how you create an Angular project with Angular CLI!
What do you think? Is the Angular command-line interface an extremely useful tool to jumpstart your project with, or did you expect your “starter kit” to include more elements, right out of the box, to get you started?
RADU SIMILEANU / May 25'2018
What can you do to speed up your Magento 2 store on mobile devices?
For let's face it: Magento 2's “ecosystem” of third-party extensions and its overall modular architecture are convenience at its best for any developer and for any eStore owner. They empower both to start small and then scale up to their most daring goals. Yet, all this power placed in your hands does come at a cost: reduced speed.
And top speed is crucial if you're determined to deliver a great mobile user experience.
So, what are the tweaks that you can make to boost your eStore's performance?
Luckily, there are plenty, ranging from:
well-known (and too frequently underrated) best practices, like optimizing your product images
to slightly more in-depth “tweaks”, like inlining critical CSS
But, let's dive right in! Here's your “emergency kit“ of 5 solutions to apply to your Magento 2 store for improving its performance on mobile:
1. Reduce Page Size to Increase Page Loading Speed
And it's still those “too obvious to be truly effective” type of techniques that have the biggest impact on an eStore's performance on mobile devices:
Lower your web page size and it will make a drastic difference for your mobile users' experience with your online store; especially for those accessing your site from mobile devices with low bandwidth.
Now here are a few simple, yet effective tweaks that you can make to reduce page size:
1.1. Use GZIP to Compress Your Pages
A handy “trick” that you can perform is to enable GZIP (if it's not already enabled) and let it “work its magic” on your web page's size.
It will compress:
fonts
CSS files
external scripts
JS
… cutting your pages' “weight” down by almost 70%.
Note: put any of your front-end pages to the Google PageSpeed Insights “test”; take note of the GZIP-related warnings popping up and ensure that the CSS/JS compression feature is enabled.
1.2. Enable JavaScript/CSS Files Minification
Here's another built-in Magento 2 feature that you only need to... switch on to speed up your Magento 2 store on mobile devices: CSS/JS file minification.
Note: do keep in mind, though, that it works in production mode only (not in default or developer mode)!
Here's how you enable it:
Navigate to the backend menu
Stores > Configuration > Advanced > Developer
Set your app/site's production mode:
php bin/magento deploy:mode:set production
Note: not sure what mode your eCommerce site's running on now? Enter the following command to identify its current mode:
php bin/magento deploy:mode:show
1.3. Optimize Your Product Pages
And the more crowded your product catalog is, the more important this solution becomes!
“Are you implying that I should take each and every one of my product images and optimize them... one by one?” you might ask yourself.
Definitely not! Since there are at least 2 easy solutions that you could go for:
you can use a content delivery network (CDN) as it will take the image optimization “burden” off your back
you can leverage the Google PageSpeed (GPS) server extension; it will compress your images in no time, among other “tricks” that it performs to speed up your Magento 2 store on mobile
2. Reduce The Server Response Time to Speed up Your Magento 2 Store
Optimizing your server's response time (or “time to first byte”) is another critical tweak that you can do to boost your Magento 2 store's speed.
Set your “target” to 0.5s, the time a browser would need to wait for your website's server response.
“But why bother, since Magento provides me with full-page cache functionality right out of the box”, you might wonder.
That's true, but just consider the pages that this pre-built functionality can't “work its magic” on, such as checkout, cart and customer pages.
2.1. Run a Thorough Audit on Your Third-Party Extension "Load"
Start reducing your server response time with a basic, yet so very effective step:
Audit your entire modules infrastructure!
identify any issues with your current plugins and (if any) look for a patch or replace them with more performant ones
turn them on and off just to detect any negative impacts on your Magento 2 site's performance
Note: as a rule of thumb, try keeping your Magento 2 third-party extensions to a minimum! Trim down your collection of modules keeping only the must-have ones; otherwise, its weight will affect your eCommerce site's performance!
2.2. Use Magento 2 Profiler to Detect Any Server Performance Issues
“What's a profiler?” you ask.
A program geared at identifying just how much time a code block needs to execute.
Using a profiler, you'll actually be drilling deep(er) into your Magento 2 store's internals to detect the very root cause of your bad server response time!
2.3. Consider Upgrading Your Hosting Plan
Is it time you upgraded your hosting server? More RAM and CPU will always have a huge impact on your eCommerce website's performance, you know.
So, how do you know whether it's time to upgrade?
Just install a brand new Magento 2 website on your current server. If it's speedier than your live website, there's no need to change your current hosting plan.
In this case, you'll only need to focus on the other tweaks included in this list here to speed up your Magento 2 store on mobile.
2.4. Use Varnish as a Full-Page Cache (FPC) Solution
Another trick for improving Magento 2's performance is to leverage Varnish, the software that caches and serves content.
The good news is that Magento 2 supports it natively.
And here's how you trigger its “power”:
navigate to the backend menu
Stores > Configuration > Advanced > System > Full Page Cache
Note: you'll need to enter a backend hostname and port and to export the Varnish configuration (VCL) file; if in doubt, ask your system admin for a hand with setting everything up properly.
3. Load and Render Above-the-Fold Content First
Prioritize the content that appears before scrolling down! It will make all the difference when it comes to your Magento 2 eStore's page loading time!
And now, here are the techniques at hand for loading and displaying this content first:
3.1. Defer Loading JavaScript
Moving all your JS code to the bottom of the page (“beneath the fold”) will implicitly make your AF (above-the-fold) content load quicker.
You'll basically postpone the time-consuming parsing of JS code and thus speed up your Magento 2 store on all mobile devices!
The good news is that there already are Magento 2 extensions that do the job for you. They'll move all your non-critical JS scripts beneath the fold!
3.2. Inline Critical Above-the-Fold CSS
“But what about the above-the-fold CSS?” you might legitimately ask yourself.
How do you approach these critical files? For you definitely can't place ALL your CSS at the bottom of the page, now can you?
Well, first you:
extract/isolate precisely this “critical” CSS
then you inline it straight into the HTML, right between the <head> and </head> tags
This way, it will get loaded and rendered first (before the non-critical CSS), along with the rest of the above-the-fold content.
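To make that concrete, here's a stripped-down sketch of what the inlining looks like (the selectors and declarations are just placeholders for whatever your critical CSS actually contains):
<head>
  <!-- critical, above-the-fold CSS inlined straight into the page -->
  <style>
    .page-header { background: #fff; }
    .hero-banner { min-height: 320px; }
  </style>
  <!-- the non-critical CSS keeps loading from its regular stylesheets -->
</head>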
Note: you might be tempted to go for one of those tools “promising” to extract this CSS for you. Unfortunately, manually setting the critical CSS for each one of your pages (homepage, checkout, category etc.) is the right way to do it.
4. Leverage the Power of HTTP/2
By moving your Magento 2 website over to HTTP/2 you'll grant your eStore users a secure and faster-browsing experience.
Not to mention the impact that it will have particularly on the experiences of those customers using a slow mobile network to access your online store.
The great news is that Magento 2 works with HTTP/2 by default. Yet, there are 2 conditions that you need to make sure your online store meets:
your hosting server should already support HTTP/2
your eCommerce web pages should be served via SSL
Note: run your own "investigation" and look for suitable extensions providing server push.
5. Magento 2 Performance Optimization: Disable JS Bundling
But why would you want to disable a Magento 2 feature that actually lowers the HTTP requests made by your browser for loading and rendering a web page?
Because it comes with its own side effects, the main one being the oversized JS file that this feature generates, of about 5-10 MB.
Moreover, it's been shown that downloading this huge external file takes more time than you'd actually save by reducing the number of HTTP requests.
Now that we've tackled the “Why”, let's focus on the “How”, as well. Here's how you disable JS bundling:
go to your website's backend menu
Stores > Configuration > Advanced > Developer
and, under JavaScript Settings, set “Enable JavaScript Bundling” to “No” (the exact label can vary slightly between Magento versions)
Note: there's no need to disable this JS files grouping feature if you're already using HTTP/2!
The END! These are but 5 of the handiest solutions that you could use to speed up your Magento 2 store on mobile. As you can see, the list includes nothing more than predictable “tweaks” and well-known best practices that you should stick to.
Silviu Serdaru / May 23'2018
It's undebatable: Node.js has practically laid the foundation of the real-time web! Real-time, two-way connection web apps have revolutionized the old web response paradigm, the one where only the client could initiate communication, never the server. Even so, there are certain cases when using Node.js is not the best decision you could make.
Specific use cases for which this otherwise flexible and revolutionary web server technology turns out to be... unsuitable. So:
“When shouldn't I use Node.js?”
You might legitimately ask yourself.
Here are the 3 bad use cases for this JavaScript runtime environment. Scan them through, take note of all the factors that I'll be outlining and think them through before rushing to power your next project with Node.js.
1. A CPU-Heavy Application: Using Node.js Is Simply a Bad Idea
Face it, deal with it and... adjust your decisions to it:
There are plenty of better solutions (other than Node.js) for powering your CPU-intensive app. It's just not the best technology at hand when it comes to heavy computation.
Now here's why, by using Node.js, you'll only end up “sabotaging” its very advantages, instead of turning it into a true “horsepower” for your app, as you might expect:
Node.js leverages an event-based, non-blocking I/O model, using a single CPU
hence, all that intense CPU-processing activity will actually block the incoming requests
… since the thread will get “held up” with number-crunching
The direct effect of “exploiting” Node.js in the context of heavy server-side computation?
The very benefits of its event-driven, non-blocking I/O model would get practically... nullified in the context of CPU-intensive operations.
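Here's a tiny sketch that makes the problem tangible, using nothing but Node's core http module (the Fibonacci function is just a stand-in for any heavy computation):
const http = require('http');

// naive, CPU-bound computation: it hogs the single thread while it runs
function fib(n) {
  return n < 2 ? n : fib(n - 1) + fib(n - 2);
}

http.createServer((req, res) => {
  // while fib(40) is being crunched, every other incoming request has to wait
  res.end(`Result: ${fib(40)}`);
}).listen(3000);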
Given this, why would you stubbornly stick to Node.js, when there are other technologies more suitable for building your CPU-heavy software with? With better results?
2. A Simple CRUD (or HTML) Application
No need to get your hopes high when using Node.js with a basic CRUD or HTML application:
It might turn out to be just “slightly” more scalable, yet don't expect a traffic flood just because it's Node.js-powered.
In short: use cases like this one, where data's provided, straightforwardly, by the server and where there's no need for a separate API, render Node.js superfluous.
There are other frameworks suited specifically to this type of project (take Ruby on Rails, for instance).
Using Node.js in this case would be like driving a Formula 1 car while... stuck in rush hour traffic.
3. A Relational Database-Backed Server-Side App
Why isn't Node.js your best choice for a relational data access type of project?
Because its relational database tools aren't as reliable, robust and easy to work with as other frameworks' toolboxes (take Rails for instance!).
Rails, for example, would “spoil” you with:
already matured Data Mapper and Active Record data access layer implementations
out-of-the-box data access setup
DB schema migrations support tools
… and the list goes on
In short: if there already are frameworks perfectly “equipped” for this type of project “scenarios”, why would you stick to using Node.js? Since its relational DB toolbox is not (yet) as advanced?
In Other Words...
With these 3 “bad” use cases for Node.js “exposed”, allow me to put together a short “inventory” here, one including all the “gotchas”, aspects to consider before kicking off your Node.js project and limitations to be aware of:
Node.js hasn't been built to solve the compute-scaling problem
… but it has been created to solve the I/O-scaling problem instead
except for CPU-heavy operations, Node.js is still the best technology at hand for powering your real-time, scalable web apps
do reconsider your decision to use Node.js if your software depends on some kind of threading model
there are also poor-quality packages available in npm, free to use in your Node.js application; do keep this in mind as you dig deep into the “load” of Node.js packages
Node.js will never be “the best choice” for event loop-blocking use cases (parsing a large XML document, for instance)
… nor for powering apps relying on intense computation
Node.js's “worker” is geared at solving HTTP server call issues (rather than intense computing issues)
The END!
RADU SIMILEANU / May 17'2018
Whether it's the increasingly challenging workload or you simply want to enhance your Node.js app's tolerance to failure and availability, there comes a time when you just need to scale it up, right? To “squeeze” the best performance out of your entire infrastructure of... nodes. Well then, here's how to scale your Node.js app:
And scaling up your web back-end app at different levels, improving its overall throughput, sure isn't an afterthought with Node.js:
Scalability is built in the very core of the runtime.
And the infrastructure of nodes, strategically distributed, communicating with each other, is what makes this runtime particularly scalable.
So, what is the best way to scale up your Node.js app?
Which are the most powerful built-in tools for scalability to explore and to “exploit”? And what are the best strategies to go for depending on your specific scenario and scalable architecture needs?
Horizontally Scaling Your Node.js App
Horizontal scaling comes down to... duplicating:
Basically, you duplicate your application instance, enabling it to “cope with” a larger number of incoming connections.
Note: you can horizontally scale your Node.js app either across different machines or on a single multi-core machine.
A word of caution: do keep in mind, though, that this scaling solution might add unnecessary complexity to your app's infrastructure; it might entail the need to provision and maintain a load balancer, might make troubleshooting more challenging, and might even change the way you deploy your app.
That being said: make sure that it's specifically this Node.js scaling solution that your project needs before you go ahead and implement it!
Vertical Scaling
If your scalability architecture needs involve nothing more than:
injecting more power
adding more memory
… with no particular “tweaking” applied to the code, then vertical scaling might just be the right answer to the “how to scale your Node.js app” dilemma.
Here's why:
by default, Node won't use more than about 1.76GB of memory on 64-bit machines
on a machine with 32GB of RAM, for instance, the Node process will limit itself to only a fraction of that memory
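Note: if vertical scaling is indeed your route, you can raise that default heap limit explicitly through V8's --max-old-space-size flag (the value is in megabytes; 4096 below is just an example):
node --max-old-space-size=4096 app.js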
Have Multiple Processes Running on The Same Machine
Here's another possible answer to your “How to Scale your Node.js app” question:
Have multiple processes running on the same port.
It goes without saying that this scaling solution calls for some kind of internal load balancing that would distribute the incoming connections across the entire ecosystem of cores/processes.
Word of caution!
Not sure whether there's any need to add this: keep the number of running processes lower than that of the cores!
Next, let's focus on 2 Node.js built-in tools for scalability that you might want to tap into:
The Cluster Module
Node's cluster module makes a great starter for scaling up your application on a single machine.
How does it work precisely?
It makes setting up child processes sharing server ports conveniently easy.
Practically, one “master” process will be in charge of spawning all the child processes (one “worker” for each core), and it's those workers that actually run your Node.js app.
Feel free to dig deeper into the details of the whole process in the official cluster module docs.
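If you'd rather see it in code, here's a minimal sketch of that master/worker split, built on nothing but Node's core cluster, http and os modules (the port and the response text are placeholders):
const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isMaster) {
  // the "master" process: spawn one worker per CPU core
  os.cpus().forEach(() => cluster.fork());
} else {
  // each "worker" runs its own copy of the app, sharing port 3000
  http.createServer((req, res) => {
    res.end(`Handled by worker ${process.pid}`);
  }).listen(3000);
}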
Yet, there are certain limitations to this basic scaling solution:
in case one of your child processes “dies”, it doesn't... regenerate itself
you'll need to handle the master-worker processes difference... the “old school way”, using an “if-else” block
there's no way of modifying multiple processes, at once, on-the-fly!
Note: yet, when it comes to the “dead child processes” drawback, there's... hope. For instance, use this piece of code that would enable the master process to... respawn the “worker”:
// in the master process: fork a replacement whenever a worker exits
cluster.on('exit', (worker, code, signal) => {
  cluster.fork();
});
And voila! This drawback has been taken off your list!
The PM2 Cluster Module
When using the PM2 cluster module, the “how to scale your Node.js app” dilemma turns into:
“Lay back and let the PM2... clusterfy your server for you!”
All you need to do is “trigger” this command's superpower:
pm2 start app.js -i 4 --name="api"
It will instantly create a 4-node cluster for you!
Now, here are some more details about what's going on “under the hood” during this process:
the PM2 daemon will take over the former “master” process's role and spawn N processes (the former “worker” processes), relying on round-robin balancing
moreover, if it's PM2 process manager that you're using, your process gets automatically scaled across all the existing cores (no need to trigger the cluster module for that anymore)
also, the same PM2 process manager will ensure that processes restart, instantly, if they happen to crash
You'll just need to write your Node.js app as if it were for single-core usage and the PM2 module will make sure that it gets scaled for multi-core.
Note: now if you want to scale your Node.js application further, you might want to consider deploying more machines...
Scaling Across Multiple Machines with Network Load Balancing
The underlying process is more than similar to the “multiple core scaling” one, if you come to think of it:
Instead of several cores, you'll have several machines; each one will be running one or more processes and will get “backed up” by a load balancer redirecting traffic to each machine in this infrastructure.
“And how does a network balancer work, more precisely?” you might ask yourself:
Once a request is sent to a node, the balancer sends the traffic to a specific process.
And there are 2 ways of deploying your internal balancer:
deploy a machine and set up a network balancer yourself, using NGINX
use a managed load balancer (like Elastic Load Balancer); setting it up is conveniently easy and it “spoils” you with all kinds of built-in features, such as auto-scaling
Now, if your “How to scale your Node.js app” question turns into “Isn't it risky to have just one point of failure in my infrastructure?”:
Just deploy multiple load balancers instead of relying on a single balancer.
They would be all pointing to the same servers, needless to add.
Note: for distributing traffic across your “ecosystem” of internal balancers, you could just add several DNS “A” records to your main domain.
How to Scale Your Node.js App: 3 Scaling Strategies to Consider
1. Decomposing
“Microservice” is another word for this scaling strategy. Practically, you'll be “juggling” multiple microservices (although their size is of no significant importance, actually).
Or multiple applications, with different codebases (and in many cases, each one of them has its own UI and dedicated database).
And it's by services and functionalities that you'll be decomposing/scaling your Node.js app. A strategy that can lead to unexpected issues in the long run, but which, if implemented correctly, translates into clear gains for your apps' performance.
2. Splitting
Or “horizontal partitioning” or “sharding”, if you prefer. This strategy involves splitting your app into multiple instances, each one responsible for a single, specific part of your app's data!
Word of caution: data partitioning calls for a lookup before you carry out each operation; this way you'll identify the right instance of the application to be used.
Take this example here:
You might want to partition your Node.js app's users by language or area of interest. In this case, a lookup step is a must; you'll need to check that information, first things first.
3. Cloning
And this is the easiest strategy at hand for solving your “How to scale your Node.js app” dilemma!
Just clone your Node.js back-end application, multiple times, and assign a specific part of the workload to each cloned instance!
It's both effective and cost-effective!
Moreover, Node's cluster module makes cloning on a single server ideally easy to implement!
And this is “How to scale your Node.js app”! See? You have not just one, but several Node.js built-in tools at hand and various strategies to choose from, depending on your scaling needs.
Which scaling solution suits you/your app project best?
RADU SIMILEANU / May 03'2018
Here you are now: your Angular 4 front-end app ready to... wow its users! “Almost ready” actually! For it still needs styling... And what better HTML and CSS framework to go for than Bootstrap, right? But how to use Bootstrap with Angular 4 more precisely?
How do you properly integrate it into your Angular 4 CLI project?
Great news: you have not just one, but 3 options at hand for adding it!
Let me get into details:
On Using Bootstrap in Your Front-End Development Process
Is there any need to list here the reasons why it's precisely Bootstrap that you're planning to implement into your Angular CLI project? Angular 4, to be more specific.
After all, it's the most popular framework for styling websites built in HTML, CSS and modern web & mobile JavaScript frameworks (like Angular here):
It's an open source, feature-rich framework that turns front-end development into such a “breeze”. Basically, it empowers you to build responsive layouts without needing to be a CSS “expert”.
And now, let's get into the step-by-step “tutorial” on how to use Bootstrap with Angular 4:
Step 1: Create a New Angular Project Using Angular CLI
The very first step to take is obviously setting up a brand new project.
Use the Angular Command Line Interface to generate it.
But first, install it on your system:
$ npm install -g @angular/cli
It's only then, once you've installed its NPM package, that you can go ahead and... generate your new project.
For doing this, just type the following command in your CLI:
$ ng new myproject
Next, feel free to change into that specific directory and to turn on the web server:
$ cd myproject
$ ng serve
“App works!” This is the message that you should be seeing in your browser right now.
Step 2: Install Bootstrap to Your Project
Now that you've launched your new Angular project, it's time to add your Bootstrap library, as well.
And you sure aren't short on options. There are 3 ways to add Bootstrap to Angular 4.
Step 3: How to Use Bootstrap with Angular 4 — 3 Different Ways to Integrate It
Option 1: Install Bootstrap from CDN
And there are 2 particular files that you'll need to include from a CDN in your project:
the Bootstrap CSS file
the Bootstrap JavaScript file
Note: keep in mind to add the jQuery JavaScript library file, as well!
Next, open the src/index.html file and insert the following:
the <link> element to add the Bootstrap CSS file at the end of the head section
a <script> element for adding jQuery at the bottom of the body section
a <script> element for inserting the Bootstrap JS file at the bottom of the body section
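Put together, the relevant bits of src/index.html would look something like this (the CDN URLs and version numbers are just examples, so double-check them against the Bootstrap 3 and jQuery versions you actually want):
<head>
  ...
  <link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css">
</head>
<body>
  <app-root>Loading...</app-root>
  <script src="https://code.jquery.com/jquery-3.2.1.min.js"></script>
  <script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/js/bootstrap.min.js"></script>
</body>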
Eager to see “Bootstrap in action” in one of your project's component templates? Then give it a try:
open the src/app/app.component.html
enter the following code there:
<div class="container">
<div class="jumbotron">
<h1>Welcome</h1>
<h2>Angular & Bootstrap Demo</h2>
</div>
<div class="panel panel-primary">
<div class="panel-heading">Status</div>
<div class="panel-body">
<h3>{{title}}</h3>
</div>
</div>
</div>
And it's the following message that this HTML template code should trigger in your browser:
“app works!”
Note: go for a Bootstrap theme of your choice; once you've picked one (from Bootswatch.com, for instance), open its bootstrap.min.css file in your browser.
Just copy the file's URL and use it to replace the string assigned to the href attribute of the <link> element, in the index.html file.
And voila! It's precisely those colors, defined by your chosen theme, that get displayed in the browser now!
Option 2: Install Bootstrap using NPM
And here's another valid answer to your “How to use Bootstrap with Angular 4” dilemma!
Simply enter:
$ npm install bootstrap@3 jquery --save
It's this command that will integrate Bootstrap and jQuery into the node_modules folder of your Angular 4 project directory.
Moreover, it will include these 2 dependencies in the package.json file, as well.
Once properly installed, you can find both packages at:
node_modules/bootstrap/dist/css/bootstrap.min.css
node_modules/bootstrap/dist/js/bootstrap.min.js
node_modules/jquery/dist/jquery.min.js
Note! You have 2 options for integrating those files into your Angular 4 project:
add the file paths to the scripts and styles arrays of the angular-cli.json file (see the sketch after this list)
add the corresponding <script> and <link> elements to your index.html file
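If you go with the first option, the relevant part of angular-cli.json would end up looking something like this (a sketch mirroring the file paths listed above; note that jQuery has to come before Bootstrap's JS in the scripts array):
...
"styles": [
  "../node_modules/bootstrap/dist/css/bootstrap.min.css",
  "styles.css"
],
"scripts": [
  "../node_modules/jquery/dist/jquery.min.js",
  "../node_modules/bootstrap/dist/js/bootstrap.min.js"
],
...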
Option 3: Add NG-Bootstrap to Your Project
The great thing about this method is that you'll no longer need to add the jQuery and Bootstrap JS dependencies. Ng-Bootstrap comes packed with a set of native Angular directives built on Bootstrap's markup and CSS.
Now, getting back to our initial “How to use Bootstrap with Angular 4” question, let's see how we install this NPM package.
For this, just enter the following command in your Angular 4 project directory:
npm install --save @ng-bootstrap/ng-bootstrap
Next, make sure you also install Bootstrap 4 to your project:
$ npm install bootstrap@4.0.0-alpha.6
And, the final step is to add the bootstrap.min.css file to the styles array of your .angular-cli.json file. (Since Ng-Bootstrap ships its own Angular directives, the jquery.min.js and bootstrap.min.js files from the previous option aren't needed here.)
Now you still need to import Ng-Bootstrap's core module, NgbModule, from its @ng-bootstrap/ng-bootstrap package.
To do this, just type the following import statement into app.module.ts:
import {NgbModule} from '@ng-bootstrap/ng-bootstrap';
All there's left for you to do now is to add the NgbModule to the @NgModule decorator's imports array.
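Here's a sketch of what the root module could look like at this point. Note that the ng-bootstrap versions from the Angular 4 era expected NgbModule.forRoot() in the root module (later versions dropped forRoot(), so check the docs for the version you've installed):
import { NgModule } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';
import { NgbModule } from '@ng-bootstrap/ng-bootstrap';
import { AppComponent } from './app.component';

@NgModule({
  declarations: [AppComponent],
  imports: [BrowserModule, NgbModule.forRoot()],
  bootstrap: [AppComponent]
})
export class AppModule { }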
And since we're here, you'll find some more than “enlightening” info (chunks of code included!) on the 2 different options at hand for importing the NgbModule:
either in your project's child modules
or in the root module itself
… in this article here on Using Bootstrap with Angular.
Using The NG-Bootstrap Components: Which Are They?
With the NgbModule installed into your Angular 4 project, you're now able to use the Ng-Bootstrap components.
To leverage them in your app.component.html.
Speaking of which, here are the components at hand:
Accordion
Alert
Rating
Tabs
Carousel
Progressbar
Collapse
Datepicker
Buttons
Pagination
Typeahead
Popover
Timepicker
Dropdown
Modal
Tooltip
The END! Does this answer your “How to Use Bootstrap with Angular 4” question?
Which method of adding this front-end framework to your project is more suitable for you?
Silviu Serdaru / Apr 30'2018
Informative, entertaining, engaging and... a key revenue source! These are just some of your expectations regarding your Magento 2 blog, right? Well, then, get ready to check them all off your “wishlist” digging through my shortlist of can't-believe-its-free Magento 2 blog extensions.
From SEO-oriented to shipping-focused features, from functionalities centered on social media to those geared at enhancing page loading speed, these 9 extensions are, each, extra functionalities to inject into your blog.
So that your blog serves your specific needs and helps you reach your goals. And all that without having to “stretch” your budget (there are only 100% free extensions in this list)...
Oh, yes: and they're all wearing the signatures of certified Magento partners!
And now, let's get straightaway to these must-have Magento 2 extensions that you should be turbocharging your blog with:
all of them “spoiling” you with configurations that make customization unexpectedly easy
… blending perfectly into your blog's design and fitting into your codebase (no need to depend on an “army” of coding experts)
1. Magento 2 Image Slider
Let's review a visual/aesthetics-oriented extension first things first.
For, as mentioned above, a “money-making” blog shouldn't be purely informative and helpful, but engaging and visually arresting, as well.
So, imagery does play its major part here!
Now here are a few of this extension's key features:
supports no less than 10 sliders
built-in support for inserting video, text and images
one of those fully responsive free Magento 2 blog extensions
provides tons of animations, with Live Preview, for you to select from
supports OWL Carousel
conveniently intuitive UI
you're free to display it anywhere on your blog with CMS & Widget
2. Facebook Live Chat
A blog is the ultimate channel of communication with your brand's audience. With your e-store's regular and potential customers.
Well, then moving from standard communication to... instant communication is a must if you want to meet their expectations. And this is what makes Facebook Live Chat one of the must-have free Magento 2 blog extensions.
It's that chatbox incorporated into your blog that's powerful enough to turn “just” guests into loyal customers.
And now, let me point out to you some of its most powerful features:
there's a Like button and a store profile incorporated into the chatbox
user statistics capabilities
unlimited History Chat
you get to set upcoming events, define greeting text and integrate your e-store's Facebook profile into the chatbox
simple backend operations for enabling/disabling the chatbox displayed on your blog
familiar UI; a Facebook Messenger Interface-alike chatbox
3. Magento 2 Lazy Load
A must-have extension for your Magento 2 blog if you care enough about the user experience that you provide there. And page loading speed does play a key role in improving/negatively impacting it.
Moreover, besides optimizing your blog's performance, Magento 2 lazy load creates some aesthetically-pleasing image transitions influencing the UX.
But let's get deeper into details and “unearth” all those advanced features that make this extension one of the must-haves:
it helps you save your web server resources — saves bandwidth and minimizes server requests
it creates smooth, blurring effect transitions for your lazy load images
… and a smooth, visually-pleasing transition when users keep scrolling down your pages
it gives your blog a ranking boost by creating friendly code strings
it optimizes your blog's page loading time
you're free to enable/disable the “Lazy Load” mode for each one of your blog's pages
you get to set an advanced time point for when images start loading
4. Better SEO, One of the Free Magento 2 Blog Extensions You Should Be Using
Inject Better SEO into your blog and... propel it in the search engines results!
And it's not “just” packed with clever features, but ideally easy to use, as well. Built to fit into your blog's existing code structure and to empower you to customize it to serve your SEO goals in detail.
I'm talking here about:
meta descriptions
meta keywords
… that this extension's flexible enough to allow you to insert quickly and easily.
Now that we've settled that Better SEO makes an ideally customizable, blog/store-friendly extension, let's check out its powerful features:
SEO checklist — a more than handy “TO Do” list, pointing out to you the SEO tasks to complete for reaching a high SEO score
it detects duplicate content issues
advanced HTML/XML sitemaps — one for the users, the other one to be used by search engines
structured data — implements schema structured data
metadata template rules — easy to define mass and dynamic metadata for your pages, categories, layered navigation
provides you with actionable SEO reports
rich snippets preview
cross links
social optimization
5. Exto Analytics
Applying a marketing strategy that lacks the proper data-fuel is like aiming at a target... blindfolded.
So, if relying on pure chance doesn't define you and if you want to go beyond the data provided to you by the native Magento 2 reporting functions, go with Exto Analytics.
Here are some more heavy-weighing reasons to do so:
real-time mobile dashboard, so you can stay “connected to” your data anytime, anywhere
convenience at its best when it comes to handling your reports — you get to sort data by specific columns and even to turn off the columns feature itself
date range picker — compare and evaluate your blog's performance on different periods of time
your previous data gets added to your reports, as well, once you install the extension
a chart, enabling you to visualize all data reports in parallel
6. Magento 2 Admin Theme
From user experience to... admin experience.
As your own blog's admin, you should also consider making your dashboard more user-friendly and intuitive.
For a high level of convenience on your side will bubble up, eventually, in the experiences that you'll create for your visitors.
But let's see specifically what makes Admin Theme one of the best Magento 2 blog extensions to use:
mobile optimized
easy to use and quick to customize
retina ready
clean, neatly structured code
a different interface for Login & Forgot Password
admin icon font
translation-ready
7. Magento 2 Infinite Scroll
It does precisely what its name says: it keeps loading content, without interruption, as your blog guests scroll down.
Fluidity in the way you present content to your readers translates into improved user experience!
And now, let's scan through this extension's specific features:
you can display and easily change the “Show” button, along with its loading text
the navigation bar can be placed anywhere on the page
you can implement it both on your category page and in the search page
the pages that your readers land on get automatically loaded
while scrolling down, your blog guests know, at all times, which section of the blog they're on
you get to customize your progress bar to your liking
users get to share the links of those specific pages that they reach during their scrolling (for instance, if they're on page 8 of your blog, they can bookmark/share the link of precisely that page)
8. Better Blog
Now, let's imagine that you don't own a blog yet, “only” an e-store.
And that now you want to integrate a simple blog, as well. One that should:
be conveniently easy to configure
have a beautiful layout design to “wow” your readers with
load fast
come packed with much-needed backend features, making updating content unexpectedly easy for you, the admin
Checked, checked, checked!
The Better Blog is undoubtedly one of the must-go-to Magento 2 extensions no matter the size of your current e-commerce site.
Once integrated into your Magento store's backend, you'll get to manage both your store and your blog from the very same place.
Here are the main reasons why it still is one of the best Magento 2 blog extensions:
SEO friendly: SEO-friendly URLs, metadata information, XML sitemap
open source code
layered navigation, with a significant impact on UX (your blog guests get to quickly track precisely those posts that they're looking for)
out-of-the-box comment functionality: Disqus Comment, Facebook Comment
blog topics
built-in product recommendations feature: "Who Bought This Also Bought", "Auto Related Products", "Frequently Bought Together"
the option to integrate your store or your blog's sitemap
responsive design
social sharing buttons
blog widgets: show your (recent) posts on your site's homepage (sidebar here included)
In short: you get to integrate a simple blog with your e-store with no need for a third-party framework!
Moreover, you'll be managing comments, categories, posts, right from your Magento 2 admin, quick and easy.
And you'll get informed each time when a blog guest has posted a comment, not to mention that the extension grows into a powerful “ally”, supporting your SEO efforts.
One of the must-have Magento 2 extensions without question!
9. Magento 2 SMTP
A powerful extension to “turbocharge” your Magento 2 blog with so you:
gain total control over your email customization process
get to run tests on your Magento 2 SMTP server
And it does all that by providing your blog with configurable port and host.
Now, let's go through its cool features:
it stores logs of all sent emails
built to support 20+ SMTP service providers
enables you to test how well your current email settings are doing
it empowers you to customize your emails in the slightest detail
The END! These are the 9 best Magento 2 blog extensions that you should be using. Scan them through, “weigh” their feature loads while setting them against your own needs and growth plans for your blog and... go for the most suitable ones!
Adriana Cacoveanu / Apr 27'2018
What's the deal with the virtual DOM? How precisely does the React virtual DOM work? It's significantly faster, without question, and it brings a whole series of benefits to coding.
How come?
Which efficiency issues of the “real” DOM does it solve? And what makes the way that React.js manipulates the DOM better than the “standard” way?
Let's get you some answers:
But First: What Is the DOM Anyway?
"Document Object Model."
It's only but natural that, before we get into details on React and the Virtual DOM, we gain a deep understanding of the DOM itself.
Therefore, here's a definition that hopefully sheds enough light on this concept:
DOM is a tree-structured abstraction of (or an in-memory representation, if you prefer) a page's HTML code. One that preserves the parent/child relationships between the nodes within its tree-like structure.
Any better?
The major benefit is the API that it provides, which allows us developers to easily scan through the HTML elements of a page and to manipulate them as needed. For instance:
to add new nodes
to edit a given node's content
to remove specific nodes
And What Is DOM Manipulation More Precisely?
It's the very process that enables the content on any of your website's pages to be dynamically updated.
Needless to add that it's JavaScript that you would use when handling the DOM. Also, methods such as:
removeChild
getElementById
… are included in the API that the “actual” DOM provides you with.
What Efficiency Challenges Does the "Real" DOM Face?
Now, before we go back to your initial “dilemma” (“how React Virtual DOM works”), let's see why a “virtual” DOM was even needed in the first place.
What efficiency issues of the “real” DOM does it address?
So, it's JavaScript that we use as we manipulate the DOM, right? And it used to work fantastically back in the days when static UIs “ruled” and the concept of dynamically updating nodes hadn't been... “invented” yet.
Well, since then things have changed...
The DOM manipulation, once the core process of all modern interactive web pages, started to show its limitations. And that's because the “real” DOM would update a “target” node along with the entire web page (with its corresponding layout and CSS).
For instance, imagine that:
You have a list of items and it's just one of those items that you need to update. Traditionally, the “real” DOM would re-render the entire list and not exclusively the items that receive updates. See?
Just think of a scenario where you have an SPA (Single Page App). One with thousands of dynamically generated nodes, that would all need to “listen to” lots of future updates and to re-render them in the UI.
It's here that things get discouragingly... slow!
The real DOM can't cope with pages carrying thousands and thousands of components to be re-rendered when updates are being passed through.
It's in this context here that the virtual DOM stepped in! And it's React that makes the most of it.
Clear enough?
How React Virtual DOM Works: Snapshots, Diffing and Reconciliation
Before we get into the “how”, let's shed some light on the “what”. What is the “virtual” DOM?
A lightweight abstraction/copy of the HTML DOM, with the same properties as the “real” one. The only difference is that it can't write to the screen like the actual DOM can. Also, it's local to React.
A copy of the actual DOM that you get to update “intensively” without impacting the real DOM.
Note: do keep in mind that it isn't React that introduced this concept; plenty of other libraries use it too.
Snapshots, Diffing and Reconciliation
Now, let's get into details on how React virtual DOM works.
a. First of all, React takes a virtual DOM snapshot before doing any updates.
b. It will then use it (this record of the DOM state) to compare it against the updated virtual DOM, before applying any changes to the actual DOM itself.
And it's a “diffing algorithm” that supports all this comparing and enables React to identify any changes. To detect the updates that have been applied.
Also, the entire process is called “reconciliation”:
Whenever updates need to be made to the actual DOM, React updates the virtual DOM first and then, once it has done its comparing, it syncs the real DOM.
In other words: before applying any of the requested updates, React makes a copy of the virtual DOM, that it will then set against the updated virtual DOM (diffing). It's during this diffing-reconciliation process that React detects the changes that have been applied and identifies the objects to be updated.
And it's precisely those objects that it will update in the actual DOM.
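A tiny sketch to illustrate (the component and its data are made up): when a single item gets marked as done, React diffs the new virtual tree against its snapshot and patches only that one <li> in the real DOM, leaving the rest of the list untouched.
import React from 'react';
import ReactDOM from 'react-dom';

class TodoList extends React.Component {
  constructor(props) {
    super(props);
    this.state = { items: ['Write post', 'Review it', 'Publish it'], doneIndex: null };
  }

  render() {
    return (
      <ul>
        {this.state.items.map((item, i) => (
          // clicking marks one item as done; only that <li> gets updated in the actual DOM
          <li key={i} onClick={() => this.setState({ doneIndex: i })}>
            {i === this.state.doneIndex ? item + ' (done)' : item}
          </li>
        ))}
      </ul>
    );
  }
}

ReactDOM.render(<TodoList />, document.getElementById('root'));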
The huge benefits?
virtual DOM updates a whole lot faster
it updates exclusively the “target” nodes, leaving the rest of the page alone
Summing Up
To recap, let's try and sum up this whole “How React Virtual DOM Works” guide here to its bare essentials.
So, here's how React updates the DOM in 3 simple steps:
first, it applies the given updates to the whole Virtual DOM
then, it compares the updated virtual DOM with the snapshot it took beforehand, using the “diffing” algorithm to spot any changes
then, it's specifically (and exclusively) those changed elements that it updates in the actual DOM
The END! Have I managed to make this process any clearer for you? Can you now see what's “under the hood” of the way React updates DOM?
And the specific reasons why it's so much faster than the real DOM manipulation?
RADU SIMILEANU / Apr 26'2018
Whether you're "constrained" to migrate content to Drupal 8 or you're just eager to jump on the Drupal 8 bandwagon and harness its much-talked-about advanced features, the most important “warning/advice” to keep in mind is:
Don't migrate mindlessly!
Meaning that before you even get to the point of:
triggering the Migrate module's capabilities and adjusting them to your migration project's needs and requirements
selecting and combining all the needed contrib modules
writing down your YAML files for carrying out your content migration process
You'll need to think through every little aspect involved in/impacted by this process:
your goals
your growth plan
your current site visitors' complaints and suggestions
That being said, here's more of a “backbone” or summary of the migration workflow, one that highlights the:
main phases to go through
the right approach to the whole process
Drupal-specific concepts and tools to use
Do NOT expect a very detailed, highly technical tutorial, though!
As for the Drupal concepts that you'll need to be already (more than) familiar with once you launch your migration process, you might want to have a look at this guide on Understanding Drupal.
And now, let's delve in:
1. The Migration Workflow: 4 Key Phases to Consider
Here's the entire process in 4 steps (so you know what to expect):
first, you'll need to migrate your data into the destination nodes, files and paragraphs on the newly built Drupal 8 site
then you'll migrate data into date, image, taxonomy, address and file fields
next, you'll move your precious data from JSON and CSV files
and finally, you'll complete your migrations from the UI and the terminal
2. Are You Upgrading from Drupal 6 or 7 or Migrating From a Different System?
And here's what to expect depending on your answer to the above question:
if you migrate content to Drupal 8 from an older version of Drupal (6 or 7), then you're quite “spoiled”: a lot of hard work has been done, by the Drupal community, for turning this migration process into the official path to Drupal 8; you could say that the solid framework has already been set up, so all there's left for you to do is to... take advantage of it!
if it's from a whole different system that you're migrating your site (let's say WordPress or maybe Joomla), then... expect it to be a bit more challenging. Not impossible, yet more complex
3. Plan Everything in Detail: Think Everything Through!
Now with the risk of sounding awfully annoying and repetitive, I feel like stressing this out:
Don't migrate... mindlessly!
Plan everything in the smallest detail. Re-evaluate the content on your current site and its “load” of features.
Take the time to define your clear goals and to put together your growth plan (if there's any).
Then, do lend ear to what your current site visitors have to say, filter through all their complaints and suggestions and tailor your final decisions accordingly.
It's only then that you can go ahead and set up your content architecture.
4. Start With the Structure: Build Your Drupal 8 Site First
“But I haven't picked a theme yet!” you might be thinking.
No need to! Not at this stage of the migration process.
You can still build your Drupal 8 site, from the ground up, even without a theme ready to be used. You can add it later on, once you have the final version of your content!
But the site itself, its solid structure, this is a “must do”. It's the very foundation of all your next operations included in your migration workflow!
5. Deep Clean & Declutter! Take Time to Audit Your Content
Don't underrate this very step! For moving over all that clutter, that heavy load of unused, outdated features and all those chaotic, crummy pages will only impact your Drupal 8 site's performance from the start.
So, now it's the right time to do some... deep cleaning!
Audit your content, your features, plugins and other functionalities included in your site's infrastructure and... trim it down by:
relevance (are you using it?)
quality: keyword-stuffed, unstructured pages (a heavy pile of them) will surely not give your new Drupal 8 site any significant jumpstart in rankings!
6. About the Migration Module Included in Drupal 8 Core
Using this dedicated module in Drupal core to migrate content to Drupal 8 comes down to implementing the:
Extract-Transform-Load process
Or simply: ETL.
In Drupal — as related to the Drupal migrate module — these 3 operations come under different names:
the source plugin stands for “extract”
the process plugin stands for “transform”
the destination plugin stands for “load”
7. Time to... Migrate Content to Drupal 8 Now!
Now it's time to put some order into that “pile” of content of yours! To neatly structure Google Sheets, XML files, CSV files etc.
And here's the whole “structuring process” summed up to the 3 above-mentioned plugins: source, process and destination.
Source:
XML file
SQL database
Google Sheet
CSV file
JSON file
Process:
iterator
default_value
migration_lookup
concat
get
Destination:
images
users
paragraphs
nodes
files
And here's a specific example of how to “glue” data for a neater and ideally structured content architecture:
Before the migration:
A: First Name: Kevin
B: Last Name: Thomson
C: Department: Commerce
After the migration:
A: Name: Kevin Thomson
B: Department: Commerce
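And here's a minimal, hypothetical YAML sketch of how the concat process plugin could “glue” those two source columns into a single destination field (the migration id, file path, field names and source-plugin options are made up, and the exact options vary between module versions):
# illustrative migration definition only; adjust to your own source and content type
id: staff_import
label: 'Import staff members from a CSV file'
source:
  plugin: csv
  path: 'modules/custom/example_migrate/data/staff.csv'
  ids:
    - last_name
process:
  title:
    plugin: concat
    source:
      - first_name
      - last_name
    delimiter: ' '
  field_department: department
destination:
  plugin: 'entity:node'
  default_bundle: staff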
8. 4 Contrib Modules to Incorporate Into Your Migration Workflow
As already mentioned, the migrate content to Drupal 8 process also involves using a combination of contrib modules.
Speaking of which, allow me to get them listed here:
Migrate Tools
Migrate Source CSV
Migrate Spreadsheet
Migrate Plus
The END! This is the tutorial on how to migrate content to Drupal 8 trimmed down to its bare essentials.
To its core phases, key steps to take, main Drupal concepts to “juggle with”, right approach/mindset to adopt and best tools/modules to leverage for a smooth process!
Any questions?
RADU SIMILEANU / Apr 24'2018