AZURE DATA FACTORY INTERVIEW QUESTIONS: ACTIVITIES AND TYPES

 ACTIVITIES:-

    An activity in a pipeline defines an action or operation to perform on your data.

TYPES:-

      1. DATA MOVEMENT ACTIVITIES

      2. DATA TRANSFORMATION ACTIVITIES

      3. CONTROL FLOW ACTIVITIES

 1. DATA MOVEMENT ACTIVITIES:-

       Copy Data :-

                 The Copy Data activity is the core activity in Azure Data Factory for moving data from a source to a sink (destination).
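As a hedged sketch, a Copy activity definition in pipeline JSON looks roughly like the following (the activity name and the dataset names `BlobInput` and `SqlOutput` are hypothetical placeholders, and the source/sink types depend on your actual connectors):

```json
{
    "name": "CopyBlobToSql",
    "type": "Copy",
    "inputs": [ { "referenceName": "BlobInput", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "SqlOutput", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": { "type": "DelimitedTextSource" },
        "sink": { "type": "AzureSqlSink" }
    }
}
```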

2. DATA TRANSFORMATION ACTIVITIES:-

       Data flow:-

             The Data Flow activity is a cloud-native, graphical data transformation tool.

            Types:-

            1. Mapping Data Flow

            2. Wrangling Data Flow

         1. Mapping Data Flow:-

                 *Mapping Data Flows are visually designed data transformations in ADF.

                 *They let you design data transformation logic without writing any code.

           *Inside the Data Flow activity, the transformations are grouped into the following categories:

               A) Multiple inputs/outputs:-

                               1) Join transformation
                               2) Conditional Split transformation
                               3) Exists transformation
                               4) Union transformation
                               5) Lookup transformation

               B) Schema modifier:-

                               1) Derived Column transformation (to generate new columns or modify existing columns)
                               2) Select transformation (to select particular columns from the file)
                               3) Aggregate transformation (to define aggregations of columns, such as sum, average, maximum, minimum, and count, grouped by existing or computed columns)
                               4) Surrogate Key transformation (to add an incrementing key value to each row of data)
                               5) Pivot transformation (to create multiple columns from the unique row values of a single column)
                               6) Unpivot transformation (to turn multiple columns in a single record into multiple records with the values in a single column)
                               7) Rank transformation (to generate an ordered ranking based on sort conditions specified by the user)
                               8) Window transformation (to define window-based aggregations with partition-by and order-by clauses)
                               9) External Call transformation

               C) Formatters:-

                               1) Flatten transformation
                               2) Parse transformation
                               3) Stringify transformation

               D) Row modifier:-

                               1) Filter
                               2) Sort
                               3) Alter Row
                               4) Assert
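Under the hood, a Mapping Data Flow designed in the UI is saved as a data flow script. As a rough, hedged sketch (the stream names `src`, `DerivedColumn1`, `Filter1`, `snk` and the two-column schema are hypothetical), a flow with a Derived Column and a Filter transformation looks approximately like:

```
source(output(
        id as integer,
        name as string
    ),
    allowSchemaDrift: true) ~> src
src derive(upperName = upper(name)) ~> DerivedColumn1
DerivedColumn1 filter(id > 0) ~> Filter1
Filter1 sink(allowSchemaDrift: true) ~> snk
```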

       2. Wrangling Data Flow:-

                  Data wrangling is the process of transforming and cleaning raw data into a format that is usable for analysis.

3. CONTROL FLOW ACTIVITIES:-

      

Append Variable:-

                The Append Variable activity can be used to append a value to an existing array variable defined in a Data Factory pipeline.

Set Variable:-

                 The Set Variable activity can be used to set the value of an existing variable of type String, Bool, or Array defined in a Data Factory pipeline.
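As a hedged sketch, Set Variable and Append Variable activities in pipeline JSON look roughly like this (the activity names and the variables `fileCount` and `fileNames` are hypothetical and would have to be declared on the pipeline; `@item().name` assumes the append runs inside a ForEach loop):

```json
[
    {
        "name": "SetFileCount",
        "type": "SetVariable",
        "typeProperties": { "variableName": "fileCount", "value": "0" }
    },
    {
        "name": "AppendFileName",
        "type": "AppendVariable",
        "typeProperties": { "variableName": "fileNames", "value": "@item().name" }
    }
]
```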

Execute Pipeline:-

                 The Execute Pipeline activity allows a Data Factory pipeline to invoke another pipeline.
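A minimal sketch of an Execute Pipeline activity in JSON (the pipeline name `ChildPipeline` and the `runDate` parameter are hypothetical):

```json
{
    "name": "RunChildPipeline",
    "type": "ExecutePipeline",
    "typeProperties": {
        "pipeline": { "referenceName": "ChildPipeline", "type": "PipelineReference" },
        "waitOnCompletion": true,
        "parameters": { "runDate": "@pipeline().parameters.runDate" }
    }
}
```

Setting `waitOnCompletion` to true makes the parent pipeline block until the child pipeline finishes.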

If Condition:-

                 The If Condition activity directs pipeline execution based on the evaluation of an expression: if the expression evaluates to true, one set of activities runs; otherwise, the alternate set runs.
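A hedged sketch of an If Condition activity in JSON (the activity name and the referenced Lookup activity `LookupCount` are hypothetical; real activities would go inside the two arrays):

```json
{
    "name": "CheckRowCount",
    "type": "IfCondition",
    "typeProperties": {
        "expression": {
            "value": "@greater(activity('LookupCount').output.firstRow.cnt, 0)",
            "type": "Expression"
        },
        "ifTrueActivities": [],
        "ifFalseActivities": []
    }
}
```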

Get Metadata:-

                  The Get Metadata activity can be used to retrieve metadata (such as size, structure, and child items) of your data, for example files in Azure Data Lake.
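A rough sketch of a Get Metadata activity in JSON (the activity name and the folder dataset `SourceFolder` are hypothetical; `fieldList` selects which metadata fields to return):

```json
{
    "name": "GetFolderContents",
    "type": "GetMetadata",
    "typeProperties": {
        "dataset": { "referenceName": "SourceFolder", "type": "DatasetReference" },
        "fieldList": [ "childItems", "lastModified" ]
    }
}
```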

ForEach:-

                The ForEach activity defines a repeating (iterative) control flow in your pipeline, executing its inner activities once per item in a collection.
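A hedged sketch of a ForEach activity in JSON, iterating over the output of a hypothetical Get Metadata activity named `GetFolderContents` (inner activities would go in the `activities` array):

```json
{
    "name": "ForEachFile",
    "type": "ForEach",
    "typeProperties": {
        "items": {
            "value": "@activity('GetFolderContents').output.childItems",
            "type": "Expression"
        },
        "isSequential": false,
        "batchCount": 10,
        "activities": []
    }
}
```

With `isSequential` set to false, iterations run in parallel up to `batchCount` at a time.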

Lookup:-

             The Lookup activity can retrieve a dataset from any of the Azure Data Factory supported data sources.
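A rough sketch of a Lookup activity in JSON (the activity name, dataset `ConfigTable`, and query are hypothetical; with `firstRowOnly` set to true, downstream activities can read `output.firstRow`):

```json
{
    "name": "LookupConfig",
    "type": "Lookup",
    "typeProperties": {
        "source": {
            "type": "AzureSqlSource",
            "sqlReaderQuery": "SELECT TOP 1 * FROM dbo.Config"
        },
        "dataset": { "referenceName": "ConfigTable", "type": "DatasetReference" },
        "firstRowOnly": true
    }
}
```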

Filter:-

             Filter activity can be used in a pipeline to apply a filter expression to an input array.
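A hedged sketch of a Filter activity in JSON that keeps only `.csv` entries from the child items of a hypothetical Get Metadata activity named `GetFolderContents`:

```json
{
    "name": "FilterCsvFiles",
    "type": "Filter",
    "typeProperties": {
        "items": {
            "value": "@activity('GetFolderContents').output.childItems",
            "type": "Expression"
        },
        "condition": {
            "value": "@endswith(item().name, '.csv')",
            "type": "Expression"
        }
    }
}
```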

Until:-

              Until activity executes a set of activities in a loop until the condition associated with the activity evaluates to true.
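A rough sketch of an Until activity in JSON (the activity name, the `loadStatus` variable, and the one-hour timeout are hypothetical; looped activities would go in the `activities` array):

```json
{
    "name": "UntilLoadComplete",
    "type": "Until",
    "typeProperties": {
        "expression": {
            "value": "@equals(variables('loadStatus'), 'Done')",
            "type": "Expression"
        },
        "timeout": "0.01:00:00",
        "activities": []
    }
}
```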

Wait:-

             The Wait activity pauses pipeline execution for a specified time period before continuing with the subsequent activities.
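A minimal sketch of a Wait activity in JSON (the name and the 30-second duration are arbitrary):

```json
{
    "name": "Wait30Seconds",
    "type": "Wait",
    "typeProperties": { "waitTimeInSeconds": 30 }
}
```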

Web:-

               The Web activity can be used to call a custom REST endpoint from a Data Factory pipeline.
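A hedged sketch of a Web activity in JSON (the URL and body are hypothetical placeholders):

```json
{
    "name": "NotifyEndpoint",
    "type": "WebActivity",
    "typeProperties": {
        "url": "https://example.com/api/notify",
        "method": "POST",
        "body": { "status": "pipeline finished" }
    }
}
```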

Azure Function:-

                   The Azure Function activity allows you to run Azure Functions in a Data Factory pipeline. The functions themselves are created in an Azure Function App in Microsoft Azure.
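A rough sketch of an Azure Function activity in JSON (the linked service `FunctionAppLinkedService` and the function name `ProcessBatch` are hypothetical):

```json
{
    "name": "RunFunction",
    "type": "AzureFunctionActivity",
    "linkedServiceName": {
        "referenceName": "FunctionAppLinkedService",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "functionName": "ProcessBatch",
        "method": "POST",
        "body": { "batchId": "@pipeline().RunId" }
    }
}
```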

Validation :-

                  The Validation activity checks for the existence of a specified dataset (for example, a file), polling at a specified interval. The pipeline continues only once the dataset exists; otherwise it keeps waiting until the file arrives or the timeout is reached.
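A hedged sketch of a Validation activity in JSON (the dataset `TriggerFile`, the 12-hour timeout, and the 60-second polling interval are hypothetical; `sleep` is the number of seconds between checks):

```json
{
    "name": "WaitForTriggerFile",
    "type": "Validation",
    "typeProperties": {
        "dataset": { "referenceName": "TriggerFile", "type": "DatasetReference" },
        "timeout": "0.12:00:00",
        "sleep": 60
    }
}
```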

Stored Procedure:-

                  The Stored Procedure activity is used to run a stored procedure created in a database such as Azure SQL Database.
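A rough sketch of a Stored Procedure activity in JSON (the linked service, the procedure `dbo.usp_LoadStage`, and the `RunDate` parameter are hypothetical):

```json
{
    "name": "RunLoadProc",
    "type": "SqlServerStoredProcedure",
    "linkedServiceName": {
        "referenceName": "AzureSqlLinkedService",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "storedProcedureName": "dbo.usp_LoadStage",
        "storedProcedureParameters": {
            "RunDate": { "value": "@pipeline().parameters.runDate", "type": "String" }
        }
    }
}
```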

Delete:-

               The Delete activity is used to delete files or folders referenced by a specified dataset in storage supported by Azure Data Factory.
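A hedged sketch of a Delete activity in JSON (the dataset `StagingFolder` is hypothetical, and the `storeSettings` type depends on the actual storage connector):

```json
{
    "name": "DeleteStagedFiles",
    "type": "Delete",
    "typeProperties": {
        "dataset": { "referenceName": "StagingFolder", "type": "DatasetReference" },
        "enableLogging": false,
        "storeSettings": { "type": "AzureBlobStorageReadSettings", "recursive": true }
    }
}
```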

Script:-

            The Script activity is used to run SQL commands such as TRUNCATE, DELETE, INSERT, etc. against a database like Azure SQL Database.
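A rough sketch of a Script activity in JSON (the linked service and table name are hypothetical; `NonQuery` indicates a statement that returns no result set):

```json
{
    "name": "TruncateStageTable",
    "type": "Script",
    "linkedServiceName": {
        "referenceName": "AzureSqlLinkedService",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "scripts": [ { "type": "NonQuery", "text": "TRUNCATE TABLE dbo.Stage" } ]
    }
}
```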








