Monday, 25 December 2017

Salesforce Apex Hours : 2017



Hello Everyone !

"Salesforce Apex Hours" is a recurring event to talk about Salesforce! Sometimes we meet in one location, and sometimes online. We are happy to have completed 7 successful online events in 2017, on the topics below:

1) Microservices
2) Einstein Intent
3) Winter 18
4) Salesforce DX
5) Hyper Batch Job
6) Lightning Component Framework
7) Live Agent


I want to thank our speakers for making these online events successful. If you missed an event, you can still check out the recordings and slides. Here is a summary of each event:

1) Microservices

In this session, Don Robins presented "Mitigate with Mono-Purpose Microservices".

The flexibility built into the Salesforce platform allows developers to build incredible enterprise and mobile applications using a No Code, Low Code or High Code approach. Developers can build most of their apps with declarative tools, and code their way to success with Apex, Visualforce and Lightning Components when needed. But what’s a developer to do when the platform tools just don’t let them do what they need? In this presentation we’ll explore how to mitigate such requirements by integrating Salesforce with external Heroku micro-services to provide extended solutions that typically can't be met with Force.com capabilities.

Agenda :-
  • Microservices –WHAT, WHY, HOW
  • My Microservice – PDF Parser a practical mitigation use case
  • Sample Microservice demo and code walk thru
  • Take-aways and Links 
 http://amitsalesforce.blogspot.com/2017/12/MicroservicesSalesforce.html


2) Einstein Intent

In this session, Daniel Peter (Salesforce MVP) presented "Einstein Intent".

Agenda :-
  • Introduction to Einstein Intent
  • Demo
  • FAQ
http://amitsalesforce.blogspot.com/2017/10/einstein-intent.html


3) Winter 18

In this session, Jitendra Zaa (Salesforce MVP) presented "Winter 18 for Developers".

Agenda :-
  • What’s new in Winter 18 for Developer
  • Enhancements in Flow
  • Other platform improvements 

http://amitsalesforce.blogspot.com/2017/10/Winter18Salesforce.html

4) Salesforce DX

In this session, Jitendra Zaa (Salesforce MVP) presented "Salesforce DX".

Salesforce DX provides you with an integrated, end-to-end lifecycle designed for high-performance agile development. In this session we went hands-on and saw how Salesforce DX can be used to create scratch orgs, automate testing, and load data. We also discussed the CLI as well as the Force.com IDE.

Agenda :-
  • Introduction to Salesforce DX
  • Creating Scratch Org
  • Deploying metadata to Scratch Org
  • Creating Skeleton Workspace
  • Running Test classes
  • Getting Help
  • Using Force.com IDE with Salesforce DX
  • Q&A 

http://amitsalesforce.blogspot.com/2017/06/salesforce-apex-hours-salesforce-dx.html

5) Hyper Batch Job

In this session, Daniel Peter (Salesforce MVP) presented "Hyper Batch Job".

Agenda :-
  • What is HyperBatch
  • Differences between a traditional Apex batch job and HyperBatch
  • Demo
  • FAQ

http://amitsalesforce.blogspot.com/2017/05/salesforce-apex-hours-hyperbatch.html

6) Lightning Component Framework

In this session, Mohith Shrivastava presented "Lightning Component Framework".

Agenda :-
  • Introduction To Lightning Components
  • Difference between Lightning Components and Visualforce
  • Thinking in terms of Component Model
  • Basics of Lightning - Inside Bundle
  • Events - Discuss Application and Component Events

http://amitsalesforce.blogspot.com/2017/04/salesforce-apex-hours-lightning.html


7) Live Agent

In this session, Amit Chaudhary presented "Live Agent".

Agenda :-
  • What is Live Agent
  • Pre and Post Chat Form
  • Demo
  • FAQ

http://amitsalesforce.blogspot.com/2017/02/salesforce-apex-hours-live-agent.html



Please follow us on the pages/links below for future sessions :-
Facebook Page :- https://www.facebook.com/FarmingtonHillsSfdcdug/
Meetup Link :- http://www.meetup.com/Farmington-Hills-Salesforce-Developer-Meetup/

Twitter Tags :- #FarmingtonHillsSFDCdug  #SalesforceApexHours
YouTube :- https://www.youtube.com/channel/UChTdRj6YfwqhR_WEFepkcJw/videos


Please email me if you would like to speak at one of our online events.

Thanks,
Amit Chaudhary
@amit_sfdc
amit.salesforce21@gmail.com



Saturday, 16 December 2017

Salesforce Apex Hours:-Mitigate with Mono-Purpose Microservices



The Farmington Hills Salesforce Developer Group / "Salesforce Apex Hours" organized another successful online session on Saturday, Dec 16, 2017, focusing on "Mitigate with Mono-Purpose Microservices".

"Salesforce Apex Hours" is a recurring event to talk about Salesforce! Sometimes we meet in one location, and sometimes online. This time we planned an online session on "Mitigate with Mono-Purpose Microservices" with Don Robins (Salesforce MVP).

Agenda :- 
1) Microservices –WHAT, WHY, HOW
2) My Microservice – PDF Parser a practical mitigation use case
3) Sample Microservice demo and code walk thru
4) Take-aways and Links


The flexibility built into the Salesforce platform allows developers to build incredible enterprise and mobile applications using a No Code, Low Code or High Code approach. Developers can build most of their apps with declarative tools, and code their way to success with Apex, Visualforce and Lightning Components when needed. But what’s a developer to do when the platform tools just don’t let them do what they need? In this presentation we’ll explore how to mitigate such requirements by integrating Salesforce with external Heroku micro-services to provide extended solutions that typically can't be met with Force.com capabilities.

 
Here is the session PPT

https://www.slideshare.net/AmitChaudhary112/salesforce-apex-hoursmitigate-with-monopurpose-microservices

Here is the session recording






Thanks
Amit Chaudhary @amit_sfdc
Email :- amit.salesforce21@gmail.com

Wednesday, 15 November 2017

Salesforce Deployment Using ANT Migration Tool | ANT Deployment Tool

Please follow the steps below to configure the ANT Migration Tool.

Step 1) Install Java
Use Java 1.7 or later for better security and TLS support.



Step 2) Download ANT

        1) Download ANT from https://ant.apache.org/bindownload.cgi


        2) Download the zip file of the Force.com Migration Tool from https://developer.salesforce.com/docs/atlas.en-us.daas.meta/daas/forcemigrationtool_install.htm

       or navigate to Setup -> Tools -> Force.com Tools and Toolkits


     Then click on "Force.com Migration Tool" and download the file.

     Now we have two zip files.


     Unzip both files.

        3) Now open the Force.com Migration Tool folder (salesforce_ant_41.0 in my case), copy the "ant-salesforce.jar" file, and paste it into the apache-ant lib folder ("D:\ANTTool\apache-ant-1.10.1-bin\apache-ant-1.10.1\lib" in my case).


 
Step 3) Set the path

Now we need to set the ANT_HOME and JAVA_HOME variables, which we can set from the Environment Variables dialog.

ANT_HOME   :- Set this to the folder where you installed ANT.
JAVA_HOME  :- Set this to the folder where your JDK is located.
Path       :- Add the ANT and Java bin folders to the Path.


All set. Now check the versions.
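On Windows these are set through the Environment Variables dialog as described above; on macOS/Linux the equivalent is a few export lines. A minimal sketch (the paths below are examples only; substitute your own ANT and JDK install locations):

```shell
# Example paths only; adjust to where you unzipped ANT and installed the JDK.
export ANT_HOME="/opt/apache-ant-1.10.1"
export JAVA_HOME="/usr/lib/jvm/jdk1.8.0"
export PATH="$PATH:$ANT_HOME/bin:$JAVA_HOME/bin"

# Confirm the variables are set; afterwards run "ant -version" and
# "java -version" to verify both tools resolve from the Path.
echo "ANT_HOME=$ANT_HOME"
echo "JAVA_HOME=$JAVA_HOME"
```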



Step 4) Configure the "build.xml" and "build.properties" files

To set up the connection with Salesforce, we need to configure "build.properties".
  • Go to the folder where you unzipped the Force.com Migration Tool files (salesforce_ant_41.0 in my case).
  • Open "build.properties" and update it as below:
    # build.properties
    #

    # Specify the login credentials for the desired Salesforce organization
    sf.username = amit.salesforce21@gmail.com.demoapp
    sf.password = Password

    sf.myDevusername = amit.salesforce21@gmail.com.kt
    sf.myDevpassword = Password

    # Note: if your IP address is not trusted for the org, append your security token to the password.

    #sf.sessionId = <Insert your Salesforce session id here.  Use this or username/password above.  Cannot use both>
    #sf.pkgName = <Insert comma separated package names to be retrieved>
    #sf.zipFile = <Insert path of the zipfile to be retrieved>
    #sf.metadataType = <Insert metadata type name for which listMetadata or bulkRetrieve operations are to be performed>

    # Use 'https://login.salesforce.com' for production or developer edition (the default if not specified).
    # Use 'https://test.salesforce.com' for sandbox.
    sf.serverurl = https://login.salesforce.com

    sf.maxPoll = 20
    # If your network requires an HTTP proxy, see http://ant.apache.org/manual/proxy.html for configuration.
    #




  • Now configure "build.xml" for retrieving and deploying packages:

<project name="Sample usage of Salesforce Ant tasks" default="test" basedir="." xmlns:sf="antlib:com.salesforce">

    <property file="build.properties"/>
    <property environment="env"/>

    <!-- Setting default value for username, password and session id properties to empty string
         so unset values are treated as empty. Without this, ant expressions such as ${sf.username}
         will be treated literally.
    -->
    <condition property="sf.username" value=""> <not> <isset property="sf.username"/> </not> </condition>
    <condition property="sf.password" value=""> <not> <isset property="sf.password"/> </not> </condition>
    <condition property="sf.sessionId" value=""> <not> <isset property="sf.sessionId"/> </not> </condition>

    <taskdef resource="com/salesforce/antlib.xml" uri="antlib:com.salesforce">
        <classpath>
            <pathelement location="../ant-salesforce.jar" />           
        </classpath>
    </taskdef>
   
    <!-- Deploys the code in the "codepkg" directory -->
    <target name="deployCode">
      <!-- Upload the contents of the "codepkg" directory -->
      <sf:deploy username="${sf.myDevusername}" password="${sf.myDevpassword}" sessionId="${sf.sessionId}" serverurl="${sf.serverurl}" maxPoll="${sf.maxPoll}" deployRoot="codepkg">
       </sf:deploy>
    </target>
   
    <!-- Shows retrieving code; only succeeds if done after deployCode -->
    <target name="retrieveCode">
      <!-- Retrieve the contents listed in the file codepkg/package.xml into the codepkg directory -->
      <sf:retrieve username="${sf.username}" password="${sf.password}" sessionId="${sf.sessionId}" serverurl="${sf.serverurl}" maxPoll="${sf.maxPoll}" retrieveTarget="codepkg" unpackaged="codepkg/package.xml"/>
    </target>

</project>
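The deployCode target above deploys with the org's default test behavior. If you want the deployment to run specific tests, the sf:deploy task also supports a testLevel attribute with nested runTest elements (available in recent Migration Tool versions; "MyTestClass" below is a placeholder for one of your own test classes). A sketch of such a target, to be added inside the <project> element:

```xml
<target name="deployCodeRunTests">
  <!-- Deploy "codepkg" and run only the named test class -->
  <sf:deploy username="${sf.myDevusername}" password="${sf.myDevpassword}"
             serverurl="${sf.serverurl}" maxPoll="${sf.maxPoll}"
             deployRoot="codepkg" testLevel="RunSpecifiedTests">
    <runTest>MyTestClass</runTest>
  </sf:deploy>
</target>
```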



  • Now set up your package.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>*</members>
        <name>ApexClass</name>
    </types>
    <types>
        <members>*</members>
        <name>ApexTrigger</name>
    </types>
    <version>41.0</version>
</Package>


Step 5) Now execute the commands

To retrieve the code from the source Salesforce org, execute:

ant retrieveCode



Now deploy the code to the target Salesforce org:

ant deployCode



Thanks,
Amit Chaudhary

Sunday, 29 October 2017

Salesforce Apex Hours: Einstein Intent


The Farmington Hills Salesforce Developer Group / "Salesforce Apex Hours" organized another successful online session on Saturday, Oct 28, 2017, focusing on "Einstein Intent".

"Salesforce Apex Hours" is a recurring event to talk about Salesforce! Sometimes we meet in one location, and sometimes online. This time we planned an online session on "Einstein Intent" with Daniel Peter (Salesforce MVP).

Agenda :-
  • Introduction to Einstein Intent
  • Demo
  • FAQ

Speakers :- Daniel Peter (Salesforce MVP), Amit Chaudhary
Date :- Saturday, Oct 28, 2017, 11:00 AM EST
Venue/Link :- https://www.meetup.com/Farmington-Hills-Salesforce-Developer-Meetup/events/244157201/


Here is the session PPT

https://www.slideshare.net/AmitChaudhary112/salesforce-apex-hours-einstein-intent/1


Here is the session recording




Thanks
Amit Chaudhary @amit_sfdc
Email :- amit.salesforce21@gmail.com

 

Sunday, 1 October 2017

Salesforce Apex Hours: What Winter 18 Means for Developers



The Farmington Hills Salesforce Developer Group / "Salesforce Apex Hours" organized another successful online session on Saturday, Sept 30, 2017, focusing on "Winter 18".


This time we planned an online session on "Winter 18 for Developers" with Jitendra Zaa (Salesforce MVP).

Agenda :-
  • What’s new in Winter 18 for Developer
  • Enhancements in Flow
  • Other platform improvements


Speakers :- Jitendra Zaa (Salesforce MVP), Amit Chaudhary
Date :- Saturday, Sept 30, 2017, 11:00 AM EST
Venue/Link :- https://www.meetup.com/Farmington-Hills-Salesforce-Developer-Meetup/events/243446732/



Here is the session PPT


https://www.slideshare.net/secret/mMLe4R3pVZGNd2



Here is the session recording






Thanks
Amit Chaudhary @amit_sfdc
Email :- amit.salesforce21@gmail.com


Wednesday, 6 September 2017

Salesforce to Salesforce | S2S | How to Setup Salesforce to Salesforce



STEP 1) Enabling Salesforce to Salesforce

Enable Salesforce to Salesforce in both orgs. Follow the steps below:
Setup --> Salesforce to Salesforce --> Salesforce to Salesforce Settings.


Then click the Edit button and check the Enable checkbox.


NOTE :- You need the "Manage Connections" permission at the profile level.

Step 2) Connection Setup

To set up a connection, open the "Connections" tab, then click New.

Then select the Account and Contact, and click "Save & Send Invite". You also need to set the Owner; the Owner is notified if any errors occur.


When you click the "Save & Send Invite" button, the contact (Amit Chaudhary) receives an email like the one below.

Then copy the URL from the email and log in as a user of the other org.


Then the other org's user needs to accept the invitation.

The connection is now established, but no objects are shared yet.

Step 3) Publishing Objects

Now we need to publish and subscribe the standard or custom objects. To publish an object, go to the first org and click the "Publish/Unpublish" button.

Then select the objects and click Save.



You can select the fields from the Published Objects related list: select the object and click the object name.


Step 4) Subscribing Objects

Now go back to the receiving org and click the "Subscribed Objects" related list.


Then map the objects.


Now we need to map the fields. Go to the "Subscribed Objects" related list, select the object, and map the fields.



The connection setup is now complete.




Step 5) Using the Shared Connection

There are two ways to share a record: manually, or programmatically.
  •  Manually sharing records
     To share a record manually, go to the object's list view and click "Forward To Connection".


Then select the connection and click the Save button.

Records are automatically created in the target org if auto-accept is enabled for the object.

  •  Programmatically sharing records via Apex
     Sometimes we need to send records to the other org automatically. We can create a trigger like the one below to do the same.

Sample Code:
trigger ShareAccountS2S on Account (after update) {
    // Collect the Accounts from this transaction that should be shared
    Set<Id> accountIds = new Set<Id>();
    for (Account acc : Trigger.new) {
        if (acc.Active__c == 'Yes') {
            accountIds.add(acc.Id);
        }
    }

    if (!accountIds.isEmpty()) {
        // Look up the Salesforce to Salesforce connection by name
        String connName = 'ForBlog'; // replace with your connection name
        List<PartnerNetworkConnection> conns = [
            SELECT Id FROM PartnerNetworkConnection
            WHERE ConnectionName = :connName];

        if (!conns.isEmpty()) {
            Id networkId = conns[0].Id;
            // Forward each Account (and its related Contacts) to the connection
            List<PartnerNetworkRecordConnection> recordShares =
                new List<PartnerNetworkRecordConnection>();
            for (Id accId : accountIds) {
                recordShares.add(new PartnerNetworkRecordConnection(
                    ConnectionId = networkId,
                    LocalRecordId = accId,
                    RelatedRecords = 'Contact'));
            }
            insert recordShares;
        }
    }
}
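After the trigger fires, each forwarded record gets an entry in the standard PartnerNetworkRecordConnection object, so you can check the outcome from the Developer Console with a bit of anonymous Apex. A minimal sketch (field names are from the standard object; adjust the LIMIT as needed):

```apex
// Inspect the most recent forwarded-record entries and their status.
for (PartnerNetworkRecordConnection share : [
        SELECT LocalRecordId, Status, ConnectionId
        FROM PartnerNetworkRecordConnection
        ORDER BY CreatedDate DESC
        LIMIT 10]) {
    System.debug(share.LocalRecordId + ' -> ' + share.Status);
}
```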

Related Post
1) Salesforce to Salesforce Overview
2) https://developer.salesforce.com/page/Best_Practices_for_Salesforce_to_Salesforce

Thanks
Amit Chaudhary
@amit_sfdc