#Ask2Shamik Presents Microservice communication using Feign Client in Spring Boot
In this video, we will learn what a Feign client is.
Step-by-step implementation of synchronous communication via a Feign client using Spring Cloud.


Ask2Shamik Presents a simple way to understand Microservice communication through live coding.

#Ask2Shamik Presents Microservice communication via RestTemplate using Spring Boot
In this video, we will learn what Microservice communication means.
Step-by-step implementation of synchronous communication via RestTemplate using Spring Cloud.
How to communicate with the config server to resolve properties using Spring Boot.

Ask2Shamik Presents Eureka server live coding with a step-by-step explanation.

Ask2Shamik Presents Eureka server live coding with a step-by-step explanation. To grasp Microservice concepts, come and subscribe right now.
#Ask2shamik #javaonfly #shamik #microservice #springboot #java #Dzone #JCG
Ask2Shamik Presents key points you want to know about service discovery (Eureka server).
Takeaways from this video:
What is the service discovery pattern?
Why is it needed in Microservices?
Demerits of the traditional load balancer.
What is a Eureka server?
How multiple Eureka servers communicate with each other.
Microservices communication.

Ask2Shamik Presents a brand new video on Microservice pivot components (English version).

Microservices are the hot cake now. To cook Microservices the best way, you want to know which ingredients are most important so you get an optimum experience. Check this video, which touches on each important component: Zuul, Eureka, Consul, Zookeeper, Ribbon, Zipkin, Hystrix, Splunk, and the ELK stack.

Have a dilemma about what to choose for your new project, Monolith or Microservice? Check this video on Ask2Shamik.

Have a dilemma about what to choose for your new project, Monolith or Microservice? Ask2Shamik presents five checklists to choose wisely between Monolith and Microservice.

Ask2Shamik Presents a talk on Microservice characteristics.

Ask2Shamik Presents a talk on Microservice characteristics, where you can learn the common misconceptions about Microservices.
Microservices characteristics and a deep dive into each characteristic.
Subscribe to Ask2shamik to get the latest videos on Microservices.
#ask2shamik #javaonfly #shamik #Microservices #Springboot #dzone #jcg

Ask2Shamik Presents Why Microservices?


Ask2Shamik Presents a video called Why Microservices.

Takeaways from this video:

1. Why are Microservices needed?
2. What are the pain points of Monolith architecture?
3. What are the benefits of Microservices?

Please watch and subscribe to Ask2Shamik, a brand new YouTube channel from Javaonfly production.

Hope you enjoy it!!!

Building Layered Architecture just in 3 minutes: Final Part

I am sure you enjoyed Part 1, where in just the blink of an eye we set up an in-memory database, created a repository interface using Spring Data JPA, and inserted data using an InitDataLoader with Spring Boot.
In this tutorial, we will expose that employee data to the UI: the classic MVC pattern using Spring Boot.
The Service part: In my previous article I created an EmployeeRepositiry interface, and Spring Boot provides its implementation at runtime. Now I want this data to flow to the service layer. To achieve that, I need to inject the repository into the service layer and define CRUD methods which do nothing but call the repository's CRUD methods. Now you may be wondering: the EmployeeRepositiry interface is empty and it extends CrudRepository, where you can only find the declarations of the CRUD methods. So the question is, where are the implementations?

Again, the answer is Spring Boot: at runtime it creates an implementation class that contains all the CRUD method implementations. Interesting, isn't it? We just declare an interface and the container creates all the stuff for us, saving lots of our precious time and getting rid of silly mistakes; truly IoC (Inversion of Control).
Let's see how the service part looks. Please note that for simplicity I print messages to the console using System.out, but in a real project never do this; always use a logger.

package com.example.layerdArchitechture.service;

import java.util.Optional;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import com.example.layerdArchitechture.entity.Employee;
import com.example.layerdArchitechture.repository.EmployeeRepositiry;

@Service
public class EmployeeService {

    @Autowired
    EmployeeRepositiry repo;

    public Employee addEmployee(Employee emp) {
        emp = repo.save(emp);
        System.out.println("Employee saved::" + emp);
        return emp;
    }

    public Optional<Employee> findEmployeeById(Long empId) {
        Optional<Employee> emp = repo.findById(empId);
        System.out.println("Employee found::" + emp);
        return emp;
    }

    public Iterable<Employee> findAllEmployee() {
        return repo.findAll();
    }

    public void deleteEmployeeById(Employee emp) {
        repo.delete(emp);
        System.out.println("Employee deleted::" + emp);
    }
}

Yeah, our service layer is ready. Now I need a controller part that acts as a delegator: it takes data from the UI, sends it to the service, and vice versa. I personally recommend keeping the controller a thin layer that just ports data from UI to service and from service to UI, maybe doing some data format conversion, and nothing more than that.
Before showing you the controller code, let me remind you that I have not used a traditional Spring MVC controller which forwards the response to a view; instead, I use a RestController, which directly delivers the response payload in JSON/XML format based on the produces property. By default it is JSON.
In the controller, we map URL patterns to controller methods and bind any data coming from a form into a model, the "M" part of MVC.
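A quick note on the produces property: if you later need XML instead of JSON, a controller method like the following could be added. This is only a hypothetical sketch, not part of this tutorial's code; it assumes an XML converter such as jackson-dataformat-xml is on the classpath and needs import org.springframework.http.MediaType.

// Hypothetical variant of findAllEmployee() that asks Spring to render the list as XML.
@RequestMapping(value = "/TCS/employees/xml", produces = MediaType.APPLICATION_XML_VALUE)
public Iterable<Employee> findAllEmployeeAsXml() {
    return service.findAllEmployee();
}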
Let's see the code.
package com.example.layerdArchitechture.controller;

import java.util.Optional;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import com.example.layerdArchitechture.entity.Employee;
import com.example.layerdArchitechture.service.EmployeeService;

@RestController
public class EmployeeController {

    @Autowired
    EmployeeService service;

    @RequestMapping("/TCS/employees")
    public Iterable<Employee> findAllEmployee() {
        return service.findAllEmployee();
    }

    @RequestMapping("/TCS/employee/{id}")
    public Employee findbyId(@PathVariable Long id) {
        Optional<Employee> emplyoeeContainer = service.findEmployeeById(id);
        return emplyoeeContainer.isPresent() ? emplyoeeContainer.get() : null;
    }

    @RequestMapping("/TCS/employee/addDemoEmployee")
    public Employee addEmployee() {
        Employee emp = new Employee();
        emp.setName("Demo Demo");
        emp.setSex("M");
        emp.setAddress("Demo");
        return service.addEmployee(emp);
    }

    @RequestMapping("/TCS/employee/delete/{id}")
    public String delete(@PathVariable Long id) {
        Optional<Employee> emp = service.findEmployeeById(id);
        if (emp.isPresent()) {
            service.deleteEmployeeById(emp.get());
            return "deleted Successfully";
        }
        return "Employee Not Exists, Not able to delete";
    }
}

If you look at the controller class, you can see that I autowired the service class and then wrote CRUD methods (also mapped to URL patterns) as wrappers, which internally call the service's methods, which in turn call Spring Boot's repository implementation class.
Now, if I run the Spring Boot application class and hit the following URL in the browser:
http://localhost:8080/TCS/employees
I can see the following response in the browser in JSON format.
[
  {
    "id": 1,
    "name": "Shamik Mitra",
    "address": "BagBazar",
    "sex": "M"
  },
  {
    "id": 2,
    "name": "Samir Mitra",
    "address": "BagBazar",
    "sex": "M"
  },
  {
    "id": 3,
    "name": "Swastika Basu",
    "address": "Baranagar",
    "sex": "F"
  }
]
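The other endpoints defined in the controller can be exercised the same way (assuming the application still runs on the default port 8080):

http://localhost:8080/TCS/employee/1 returns the employee with id 1 (or an empty body if it does not exist).
http://localhost:8080/TCS/employee/addDemoEmployee inserts and returns a demo employee.
http://localhost:8080/TCS/employee/delete/1 deletes employee 1 and reports whether it existed.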
Hope you enjoyed the post.

Layered Architecture Up and Running just in 5 minutes: Spring Boot Part 1

This is a two-part series where I will show how to create a Layered architecture with Spring Boot. 
What is a Layered Architecture: In simple terms, when we build an enterprise application, we maintain different layers to encapsulate layer-specific logic so that it can't spill over into another layer. When we think about an enterprise application, we can imagine three important layers in the architecture.
1. User Interface: interacts with the end user, shows data, takes user input, takes commands from the user, etc.
2. Business Layer: based on the user's command and the data captured from the user (aka the form), it takes domain-specific decisions, like what to do with the data, which table to look at, and how to manipulate the data coming from the database so it can be presented in the UI.
3. Persistence Layer: this layer captures the data and persists it; it also captures any update, deletion, or change of state of the data, so you can consider this layer as maintaining the state of the application-specific data. Irrespective of whether your application is up or down, it stores the state of the data once committed.

With a layered architecture, we create a logical encapsulation of each layer: all the UI code stays in the UI layer, all the business logic stays in the business layer, and so on.
Each layer communicates with its adjacent layers, but never with a layer that is not adjacent.
So if you have an application with three layers, UI, Business, and DAO, the UI communicates with Business, Business communicates with UI and DAO, and DAO communicates with Business. In this way we reduce coupling, make layers reusable, and welcome future changes in the architecture. Each layer has its own patterns to accommodate future change and make the layer reusable.

We all know Spring provides different components for each layer: for the UI you can use Thymeleaf, Spring templates, or any other UI framework like JSF; for the business layer you can use Controller and Service, and you can even plug in a different framework like Struts; for the persistence layer you can use Spring Data JPA, Hibernate, the JDBC template, whatever you like. But the problem is that you need to add all the plugins/jars to your pom.xml and find the right versions of the dependencies for the classpath; if versions mismatch, it will not work. You also need to add many Spring-specific annotations or XML entries in the Spring XML files to use those components in your layered architecture, which is cumbersome, and then you need to package everything and deploy it to an application server, so a lot of manual intervention is needed. Spring addressed this problem and came out with a solution they called Spring Boot.
"Spring Boot works by convention over configuration": it means you don't have to think about configuration entries, only about your business logic. Whatever component you want to use, if it is on your classpath, Spring Boot is smart enough to understand that you want to use it and configures a fully working component for you. Say you want to use JPA in your project: if you import the Spring Boot starter JPA module, it understands you want to use it and on the fly creates the JPA template, a repository, and utility CRUD methods for you. Without Spring Boot you would need to configure the JPA template, initiate a session factory from the template, get the session, and so on; none of that is required here. Spring Boot is powerful enough to do it for you, and of course, if you want to control the configuration yourself, you can override it and use your own configuration.
In this tutorial, I will show you how to create an MVC layered architecture step by step using Spring Boot, and you will be amazed that within five minutes you can have an MVC architecture up and running, which previously took a lot of time and a lot of head-scratching in case of version mismatches.
As it is a two-part series, in the first part we will set up a database and insert Employee data into the database using a JPA repository.

For this tutorial, we will use the following components

1. H2 Database: an in-memory database; it persists the data until the application is closed.

2. Spring Data JPA: we will use the Spring Data JPA component for CRUD operations on the database.

3. Rest Controller: shows the data in JSON format rather than forwarding the response to a view, unlike traditional MVC.

Step 1: Go to start.spring.io and download a template project by selecting the Web, JPA, and H2 modules.
Step 2: Import that project into Eclipse as a Maven project.
Step 3: Check the pom.xml; the spring-boot-starter-web, spring-boot-starter-data-jpa, and h2 entries will be there. These are Spring Boot packages which contain all the necessary dependencies for JPA and the web and maintain the right versions of the dependent jars, so there is no version mismatch problem.

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.example</groupId>
    <artifactId>layerdArchitechture</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>jar</packaging>

    <name>layerdArchitechture</name>
    <description>Demo project for Spring Boot</description>

    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.0.4.RELEASE</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
        <java.version>1.8</java.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-data-jpa</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-devtools</artifactId>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>com.h2database</groupId>
            <artifactId>h2</artifactId>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>

</project>


Step 4: Go to the application.properties file under src/main/resources and enable the H2 console, so we can see the data inserted into the H2 database.
spring.h2.console.enabled=true
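Spring Boot auto-configures an in-memory H2 datasource on its own, so the single property above is enough. Optionally (shown here purely as a sketch, not required for this tutorial), you can pin the console path and the JDBC URL explicitly:

spring.h2.console.path=/h2-console
spring.datasource.url=jdbc:h2:mem:testdb
spring.datasource.driver-class-name=org.h2.Driver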

Step 5: Let's create an Employee entity.

package com.example.layerdArchitechture.entity;

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;

@Entity
public class Employee {

    @Id
    @GeneratedValue
    private Long id;
    private String name;
    private String address;
    private String sex;

    public Long getId() {
        return id;
    }
    public void setId(Long id) {
        this.id = id;
    }
    public String getName() {
        return name;
    }
    public void setName(String name) {
        this.name = name;
    }
    public String getAddress() {
        return address;
    }
    public void setAddress(String address) {
        this.address = address;
    }
    public String getSex() {
        return sex;
    }
    public void setSex(String sex) {
        this.sex = sex;
    }

    @Override
    public String toString() {
        return "Employee [id=" + id + ", name=" + name + ", address=" + address + ", sex=" + sex + "]";
    }
}


Step 6: Now create an EmployeeRepositiry interface which extends the CrudRepository interface. Spring Boot creates an implementation on the fly with all the utility (CRUD) methods; we don't have to do anything but declare the interface.

package com.example.layerdArchitechture.repository;

import org.springframework.data.repository.CrudRepository;
import org.springframework.stereotype.Repository;

import com.example.layerdArchitechture.entity.Employee;

@Repository
public interface EmployeeRepositiry extends CrudRepository<Employee, Long> {


}
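As a side note (not required for this tutorial), if you later need queries beyond the built-in CRUD methods, Spring Data can derive them from the method name; the findByName method below is a hypothetical example of that:

package com.example.layerdArchitechture.repository;

import java.util.List;

import org.springframework.data.repository.CrudRepository;
import org.springframework.stereotype.Repository;

import com.example.layerdArchitechture.entity.Employee;

@Repository
public interface EmployeeRepositiry extends CrudRepository<Employee, Long> {

    // Spring Data parses the method name and generates the query
    // "select e from Employee e where e.name = ?1" at runtime.
    List<Employee> findByName(String name);
}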


Step 7: Now create a data loader class which inserts a few employees into the H2 database using the repository created above. For that, I autowire the EmployeeRepositiry interface. Please pay attention to the loadData method: here I create a list of employees, then iterate over the list and save the data into the H2 database using a Java 8 lambda expression.

package com.example.layerdArchitechture;

import java.util.ArrayList;
import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

import com.example.layerdArchitechture.entity.Employee;
import com.example.layerdArchitechture.repository.EmployeeRepositiry;

@Component
public class InitDataLoader {

    @Autowired
    private EmployeeRepositiry rep;

    public void loadData() {
        createData().forEach(emp -> {
            rep.save(emp);
            System.out.println("Successfully saved ::" + emp);
        });
    }

    private List<Employee> createData() {
        List<Employee> employees = new ArrayList<Employee>();

        Employee emp = new Employee();
        emp.setName("Shamik Mitra");
        emp.setSex("M");
        emp.setAddress("BagBazar");

        Employee emp1 = new Employee();
        emp1.setName("Samir Mitra");
        emp1.setSex("M");
        emp1.setAddress("BagBazar");

        Employee emp2 = new Employee();
        emp2.setName("Swastika Basu");
        emp2.setSex("F");
        emp2.setAddress("Baranagar");

        employees.add(emp);
        employees.add(emp1);
        employees.add(emp2);
        return employees;
    }
}

Step 8: We need to tell our application to scan all Spring beans, find the JPA repositories and entity classes, and register them as Spring beans, so we use two special annotations, @SpringBootApplication and @EnableJpaRepositories, on top of the Spring Boot main class. Please note that in the @EnableJpaRepositories annotation I mention the root package from which it starts finding repositories and entities.
package com.example.layerdArchitechture;

import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;

@SpringBootApplication
@EnableJpaRepositories("com.example.layerdArchitechture")
public class LayerdArchitechtureApplication {

    public static void main(String[] args) {
        SpringApplication.run(LayerdArchitechtureApplication.class, args);
    }

    @Bean
    public CommandLineRunner loadData(InitDataLoader loader) {
        return args -> loader.loadData();
    }
}
Pay attention to the loadData method: this is Java-style bean configuration where I call InitDataLoader's loadData method. The CommandLineRunner bean is invoked when the application starts, and InitDataLoader is injected as a Spring bean (IoC), so I can expect that upon starting my Spring Boot application all the data will be inserted into the database.
Step 9: Start the application. To verify that the employees were inserted into the database, go to the following URL: http://localhost:8080/h2-console and check the EMPLOYEE table in the H2 console.
Conclusion: We have successfully inserted data using Spring Boot. One thing to observe here: to insert data into the database we did not have to write a single line of CRUD code or JPA template configuration code; Spring Boot does it on our behalf. In the second part, I will show you how to expose those data in JSON format so any UI framework (Angular 5, for example) can consume them, or how to use this layered architecture as a Microservice (a few changes are needed to make it a Microservice component). Stay tuned.

Dilemma on the Utility Module: Make It a Jar or a Separate Microservice?

In my previous article, I talked about how you can come to a conclusion about what to choose for your new project: Microservice or Monolith.
As an architect, you may follow the points I mentioned in the previous article and come to the conclusion that you are going to use a Microservice architecture. A big cheers for you, you have promoted yourself to a first-class citizen of the new era of the digital world, but (when "but" is used, it means the previous words have no value until you solve the part after the "but" :)) what's next? You have heard that Microservice architecture demands componentization of services, but what actually is a component in the Microservice world?
In this article, I briefly discuss what componentization means, and what problems we face when we need to componentize a utility module.
Componentization of services: From the word "Microservice", we can understand that it is a suite of small services, so the main objective is to break a project into multiple services. But what does "service" mean here, and what kind of services are they? Are we talking about a service layer in a layered architecture, a service wrapped into a jar (a so-called "library"), or something published through a REST API? What is it?
To understand, always remember: a Microservice is an independent thing (component/service, whatever you call it) that can be deployed independently, updated independently, and whose lifecycle is managed independently.
When you create a Microservice, be very clear about that. There should not be any confusion; many tricky situations may come up, but stick to the basics: "a Microservice is an independently deployable thing".
One of the classic dilemmas is this: suppose you have a utility module. Now you are confused about what to do: wrap the utility module into a jar that all other Microservices use, so there is no code duplication, or expose the utility as a service with an API (a separate Microservice) so others can consume it.
One may think that making the utility module a jar seems severely wrong in the Microservice world, as Microservice means several independent small services, so the utility module should be published as a service; therefore the utility module must be a Microservice.
Why is the utility module as a jar a bad idea?
When you import the utility module as a jar, you limit yourself: your service now depends on the utility module packaged as a jar. If the utility module's Java version is upgraded, you have to upgrade your service, or your program may not be able to use that jar. Also, both must now stick to the same language, Java (although once a service is written in one language, I have rarely seen it rewritten in another); having said that, you should not confine your Microservices to one language. Moreover, if your utility module is used by many Microservices as a jar, any upgrade must remain backward compatible: say you want to write functions which can easily be achieved with Java 9, but you cannot upgrade your utility module because the other Microservices have not yet moved to Java 9.
Why is the utility module as a jar a good idea?
As a counter-argument, many can say that to build a Microservice itself we import jars/libraries like the Spring Boot starter parent, GSON, Jackson, etc., so why should we not package our utility as a jar, and why would that be a bad idea? Let me tell you, many architects think it is a brilliant idea, as it serves many purposes.
If we are not using the utility module as a jar, then we have two options:
1. Duplicate the functionality of the utility module in all Microservices.
2. Create a Microservice called utility service and publish an API to invoke the utility methods.
Both options have their own demerits.
1. Duplicate the functionality of the utility module in all Microservices: Here the objective is to duplicate the code in every Microservice that consumes the utility module, so that there is no utility module as such; everything is part of the local codebase. But this is against the DRY principle and a bad idea: if the utility module holds complex code involving many classes, all of that is unnecessarily copied into every Microservice, and any future change in it has to be copied into every Microservice as well. So there is a maintainability problem. Say the utility module calls an external service, gets a complex response, parses it, applies some business logic, and produces analytical data exposed through public methods (a Java API). If you copy that complex algorithm into multiple services, it is certainly a very bad idea; think how many times the duplication has been done, and if a new piece of analytical data is needed, you have to add it in every Microservice. While writing this, I am getting scared just imagining the scenario. So it does not work, unless you have a very small portion of utility code that does not change, say a generic class dealing with dates, time zones, and formats (a single class with multiple static methods); copying that class into every Microservice is a one-time effort.
2. Create a Microservice called utility service and publish an API to invoke the utility methods:
So you create one Microservice where all the utility methods are dumped. As this is a utility module, all (or most) Microservices communicate with it, so every Microservice has a link to this one. Just imagine the picture: every Microservice depends on it (HAS-A). If that service goes down due to some erroneous code, or all of its instances go down due to some major cause, every service goes down and the whole Microservice architecture is doomed. How is that different from a monolith?
In a Microservice architecture, partial failure may happen, but the system as a whole should never go down; that is the promise of near-100% uptime.
So creating a utility Microservice is certainly not looking very promising either.
As an architect, what should you do now? It is like a double-edged sword: whichever option you choose, you have to deal with its demerits.
Taking a decision: As architects, we always deal with demerits and try to choose, for each case, the option with the fewest demerits. So our favourite answer is "it depends on the scenario", and we are hated for that answer; even juniors mock us: "he never has a direct answer, he always says it depends".
Here I am also giving the answer in the same way: it depends :) on how your utility methods are written. In one line: if your utility functions are stateless, use a jar; if your utility functions deal with state, use a Microservice.
If you observe minutely, you can divide your utility methods into two categories.
One type of utility takes an input, performs some operation on it, and returns a result: there are no side effects, it is independent of any state, and every time you pass the same parameters you get the same result. In this case, do not publish it as a separate Microservice; since it does not deal with state, there is no need to publish it as a REST resource, because no create, put, or delete operation is done by it. You can package it as a jar and use it in every Microservice; think of utility jars like Jackson or Gson, which take a request, do some operation, and return a data structure. But if your utility takes a request, fetches some data from a database, does some operation and returns it, or saves state in a database, then it is ideal to publish it as a Microservice.
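As an illustration of the first, stateless category, here is a minimal sketch (the class and method names are hypothetical, not taken from any real project) of the kind of date/time-zone/format helper mentioned above that is safe to ship as a shared jar:

package com.example.util;

import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;

public final class DateFormatUtil {

    private static final DateTimeFormatter ISO = DateTimeFormatter.ISO_OFFSET_DATE_TIME;

    private DateFormatUtil() {
        // static utility: no instances and no state
    }

    // Converts a timestamp from one time zone to another and formats it.
    // Same input always produces the same output, so every Microservice can
    // embed this jar without creating any runtime coupling.
    public static String convert(LocalDateTime dateTime, String fromZone, String toZone) {
        ZonedDateTime zoned = dateTime.atZone(ZoneId.of(fromZone))
                .withZoneSameInstant(ZoneId.of(toZone));
        return ISO.format(zoned);
    }
}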
Here also think about failure: if the call is synchronous, then a failure in your utility module means the whole chain of Microservices fails. So ask yourself: can you make the utility operation asynchronous, or is it really required in a synchronous fashion? For example, an event store or storing audit information: those are cross-cutting concerns and utility operations, not part of the main business flow, so we can make them async; then, if storing the audit information or the event fails, it does not block the business flow and your service does not go down because of it.
If you have to use the utility operation in a synchronous fashion, then implement a circuit breaker with a default path, so that if the operation fails it does not block all the Microservices; at least follow the default path and show the user a message, and the user can perform other operations instead of that function. Say your transaction function calls a utility function to check the transaction amount, and based on that the bank gives the user some credit points; if that function fails, we can show a default message like "at this time we can't process the transaction", but the user can still do other things like a balance check. So at no moment of time is the service totally down.
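One way to implement that default path is a circuit breaker with a fallback method. The following is only a minimal sketch with hypothetical names (CreditPointService, the utility-service URL); it assumes spring-cloud-starter-netflix-hystrix is on the classpath and @EnableCircuitBreaker is set on the application class, which is not shown here:

package com.example.transaction;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.web.client.RestTemplate;

import com.netflix.hystrix.contrib.javanica.annotation.HystrixCommand;

@Service
public class CreditPointService {

    @Autowired
    private RestTemplate restTemplate;

    // Synchronous call to the utility Microservice; if it fails or times out,
    // Hystrix opens the circuit and routes the call to the fallback below.
    @HystrixCommand(fallbackMethod = "defaultCreditPoints")
    public Integer creditPointsFor(Long transactionAmount) {
        return restTemplate.getForObject(
                "http://utility-service/credit-points?amount=" + transactionAmount, Integer.class);
    }

    // Default path: the transaction flow continues, only the credit-point feature is degraded.
    public Integer defaultCreditPoints(Long transactionAmount) {
        System.out.println("Utility service unavailable, credit points skipped for now");
        return 0;
    }
}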
Conclusion: There are also different shades of utility: some are mostly read-only but involve database operations, some may use in-memory caching, so take the decision wisely on how you want to ship your utility, as a jar or as a Microservice.

Microservice vs Monolith: Which One to Choose?

"Microservice" is a buzzword in the industry right now, right? Apart from Microservice designs, every other design gets marked as a "Monolith". But as an architect, when you are trying to create new software for a specific domain, what will you do: adopt Microservices without judging anything, because it is the hit one and everyone runs behind it, or stop for a second and think about your company's infrastructure and your employees' expertise, and based on that decide whether to choose Microservices?
As architects, it is our duty to choose Microservices when they are required, not to go with the flow and use Microservices blindly. In this article I will talk about a few aspects which I think are most important to consider before you adopt Microservices, and I am open to hearing any other proposals or aspects I missed, even counter-arguments. This is my realization from working with both Microservice and so-called Monolith architectures.
Let's begin with the aspects to consider before choosing Microservices as the default architecture.
1. Project associate strength: When you get a project, the first parameter to consider is how many associates are going to work on it and what their experience is. If your strength is low, like 10-12 people, don't go for Microservices. Microservices means independent services, and the motto is "you build it, you run it", so one team should have complete ownership of a Microservice. If your associate strength is small, statistically each one or two persons will own two or three Microservices, and that creates a critical problem: knowledge is confined to them and they become pivot employees. On the other side, if one or two persons take ownership of two or three Microservices, the motto that developers should focus on only a small portion but know everything about that service is going to break. So think about that.
2. Microservice and its associated knowledge: Microservice is a newer architecture and there are various concepts inter-related with it: distributed-architecture rules like high availability, resiliency, service discovery, the CAP theorem, Domain-Driven Design, the circuit breaker pattern, distributed caching, route tracing, etc. Not only that, the DevOps culture is tightly coupled with Microservices in order to unleash their full power, so you need to know CI/CD tools and the culture around them (automated deployment). The team should be efficient enough to drive all of this for a Microservice. If the Microservices are deployed in the cloud or in containers, the team should also understand the cloud architecture (PCF, Amazon, Bluemix, etc.) or the container architecture (Docker, Docker Swarm, Mesos, Kubernetes, etc.). Think carefully about the team's knowledge; always remember that Microservices means multiple services and multiple teams, so each team member should have a clear understanding of all the associated knowledge needed to run their Microservices independently. If you don't have that luxury, make sure each team has one or two persons who know all the associated areas, so they can educate their team.
3. Organization infrastructure: Another important dimension is the organization's infrastructure. Before adopting Microservices, always check in what mode your organization works. As per Conway's law, whatever your organization structure is, it is reflected in your code. So check: has your organization adopted Agile? Are teams built with cross-technology members (UI, middle layer, database)? What are the deployment and QA testing modes: does your organization follow manual deployment and manual testing, or has it already adopted (or is it about to adopt) a DevOps culture? Where are the artifacts going to be deployed: does the organization have a few servers where application servers are installed and artifacts deployed, or has it moved to the cloud? Based on these parameters, decide whether you are going to adopt Microservices or not. If the organization has only a few of its own servers with application servers installed, all the different Microservice artifacts end up deployed in the same server space, which does not allow horizontal scaling, so Microservice adoption is in danger. Also, with manual testing and manual deployment, adopting Microservices is a nightmare for the Ops and QA teams, and for developers it is a nightmare to integrate the changes and test them in SIT.
4. Domain criticality: Check the nature of the domain you are working on. Sit with the business analysts and understand how critical it is: is there a need to break the domain into subdomains so that related functions can be encapsulated under a context? Based on that, take the decision. If the domain is not very complex, it is better to stick with a Monolith; there is no need to create a context map and encapsulate subdomains in bounded contexts.

5. The budget of the project: This is another important aspect to look at. Think about the budget of the project and its nature: is it a fixed-bid or a T&M engagement? Microservice projects are more expensive than monoliths, as they need more server or cloud infrastructure, an automated pipeline, a higher resource count, and cross-skilled teams, so in terms of infrastructure and resources they cost much more than a Monolith. So think about the budget aligned with your project. (Secretly I am giving you one tip, please do not publish it to anyone: as a good, loyal architect, always think about the revenue margin; a modular monolith is better than Microservices in case of a small budget, as it serves the purpose of revenue, and in the future, if you want to move to Microservices, that is also possible.)
Conclusion: Nowadays Microservices have hit in such a way that novice and mid-level developers think a Monolith is bad by definition, a BBOM (Big Ball of Mud) where all modules are strangled with each other. But if you maintain a modular approach, a Monolith is good, and in many cases it is better than Microservices when your resources are finite. So choose carefully and do not go with the flow. I always recommend going with a Monolith first, but making it modular: if you use Java 9, use modules, else you can adopt SOA, but start with a modular one, and when there is an actual need, like module boundaries starting to leak, or you cannot maintain an acyclic dependency graph (DAG) among modules, then think about breaking the Monolith based on DDD and lean towards Microservices. Not before that. What I am trying to say is: do not use Microservices just because others are adopting them; adopt them when you need them. In many cases I have seen projects try to adopt Microservices but, just because of a lack of knowledge, create something which is neither a Microservice nor a Monolith: many services without proper encapsulation, each chained to the others in such a way that one Agile team can't take an independent decision on deploying one Microservice, multiple teams and multiple artifacts depending on each other, teams blaming each other because functionality spans many services. It is a horrible situation; I will discuss it in future articles. But for now, think twice before adopting Microservices, because as Uncle Ben said, "With great power comes great responsibility".

DDD: Interchange Context and Microservices

In my previous two articles, I described a lot about bounded contexts and the context map, where we learned how to segregate domain models using bounded contexts. We also learned how to use the context map judiciously to understand the relationship between two contexts; here "relationship" is a broader statement: it does not only represent the technical relationship, it also defines the relationship between the owners of the services, who is in the commanding position, who acts as the downstream, etc. We also learned different strategies based on those relationships (partnership, upstream/downstream, anti-corruption layer, etc.).
In this article, we will deal with a special case: when something goes wrong while designing a Microservice, how does that error bleed into and corrupt other services? Then we will learn how to solve that problem using an interchange context.

Take an example: in our online student registration system, the Registration module is in the commanding position, as it has a partnership relationship with the Analytics module, the Payment module has a downstream relationship with it, and the Batch Scheduler has a downstream relationship with it as well. So it is clear the Registration module is in the commanding position, and the language it publishes everyone else has to follow: whatever message the Registration module's API publishes, the Batch and Payment modules have to receive that same message. I can say that if the Registration module publishes a language R, every other module has to consume that R language, so in this system R is the de facto language standard. You can think of this situation in real-world terms: the British conquered much of the world in earlier times, so the language they speak, English, became the de facto standard language of the world; a British person and an Indian automatically communicate in English, not the reverse, and even an Indian and a Chinese person talk to each other in English (where no British people are involved).
Now, the problem occurs if the Registration module has some issues in its design, say the boundary is not properly defined, the message structure is not properly designed, or edge cases and exceptions are not handled properly. In that case, in spite of a physical boundary (the bounded context), these problems bleed into all dependent services, as the whole context map is polluted by the error in the Registration module. The Registration module is now acting as a Big Ball of Mud (BBoM); as per Eric Evans, he calls it a small ball of mud. So one thing is clear: if the upstream/commanding system is in danger, the whole Microservice architecture becomes a mess. As architects, we somehow have to prevent the bleeding of errors from upstream Microservices/systems into downstream ones.
If we dig through the DDD patterns, we find that the ACL (anti-corruption layer) does exactly this: it sits in front of the consumer Microservice/system, translates the producer/upstream message structure into the consumer's message structure, and handles any kind of errors, so the ACL protects the consumer services from external systems. But think about a system where two services are in a partnership relationship: they produce and consume each other's messages, both depend on each other, and other services depend on them. In that case, rather than putting an ACL in front of both partners, we can use a new concept called the interchange context.
The interchange context is a technique where both partners agree upon the model and message structure, so there is one ubiquitous language for both partners: they publish one message together and other systems consume that message. In this way, the two partners can independently change their internal code and internal message structures, but in the interchange context they create a common terminology/ubiquitous language understood by both partners. Every other service that depends on these two partners can consume messages from the interchange context.


Conclusion: The interchange context is more robust than an anti-corruption layer; it stops errors from bleeding out of upstream services. If an upstream service has direct consumers, the upstream service's developers have to be very careful about the API, as any change or error in it breaks the others. The interchange context is a kind of centralized facade where the partners create the ubiquitous language and the others consume messages from the interchange context; the partners and the dependent services are all free to modify their own data models and message structures, but when they communicate with the interchange context, their internal models should be converted into the interchange context's message structures.