[{"categories":null,"contents":"","date":"January 1, 0001","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/notes/go/basic/_index.bn/","summary":"","tags":null,"title":"Go বেসিক"},{"categories":null,"contents":" Hello World A sample go program is show here.\npackage main import \u0026#34;fmt\u0026#34; func main() { message := greetMe(\u0026#34;world\u0026#34;) fmt.Println(message) } func greetMe(name string) string { return \u0026#34;Hello, \u0026#34; + name + \u0026#34;!\u0026#34; } Run the program as below:\n$ go run hello.go Variables Normal Declaration:\nvar msg string msg = \u0026#34;Hello\u0026#34; Shortcut:\nmsg := \u0026#34;Hello\u0026#34; Constants const Phi = 1.618 ","date":"January 1, 0001","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/notes/go/basic/introduction/","summary":"\u003c!-- A Sample Program --\u003e\n\u003cdiv class=\"note-card \"\u003e\n    \u003cdiv class=\"item\"\u003e\n        \u003ch5 class=\"note-title\"\u003e\u003cspan\u003eHello World\u003c/span\u003e\u003c/h5\u003e\n        \n            \u003cdiv class=\"card\"\u003e\n                \u003cdiv class=\"card-body\"\u003e\u003cp\u003eA sample go program is show here.\u003c/p\u003e\n\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-go\" data-lang=\"go\"\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#f92672\"\u003epackage\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003emain\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#f92672\"\u003eimport\u003c/span\u003e \u003cspan 
style=\"color:#e6db74\"\u003e\u0026#34;fmt\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#66d9ef\"\u003efunc\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003emain\u003c/span\u003e() {\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  \u003cspan style=\"color:#a6e22e\"\u003emessage\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e:=\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003egreetMe\u003c/span\u003e(\u003cspan style=\"color:#e6db74\"\u003e\u0026#34;world\u0026#34;\u003c/span\u003e)\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  \u003cspan style=\"color:#a6e22e\"\u003efmt\u003c/span\u003e.\u003cspan style=\"color:#a6e22e\"\u003ePrintln\u003c/span\u003e(\u003cspan style=\"color:#a6e22e\"\u003emessage\u003c/span\u003e)\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e}\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#66d9ef\"\u003efunc\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003egreetMe\u003c/span\u003e(\u003cspan style=\"color:#a6e22e\"\u003ename\u003c/span\u003e \u003cspan style=\"color:#66d9ef\"\u003estring\u003c/span\u003e) \u003cspan style=\"color:#66d9ef\"\u003estring\u003c/span\u003e {\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  \u003cspan style=\"color:#66d9ef\"\u003ereturn\u003c/span\u003e \u003cspan style=\"color:#e6db74\"\u003e\u0026#34;Hello, \u0026#34;\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e+\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003ename\u003c/span\u003e 
\u003cspan style=\"color:#f92672\"\u003e+\u003c/span\u003e \u003cspan style=\"color:#e6db74\"\u003e\u0026#34;!\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e}\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003cp\u003eRun the program as below:\u003c/p\u003e\n\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-bash\" data-lang=\"bash\"\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e$ go run hello.go\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003c/div\u003e\n            \u003c/div\u003e\n        \n    \u003c/div\u003e\n\u003c/div\u003e\n\n\u003c!-- Declaring Variables --\u003e\n\u003cdiv class=\"note-card \"\u003e\n    \u003cdiv class=\"item\"\u003e\n        \u003ch5 class=\"note-title\"\u003e\u003cspan\u003eVariables\u003c/span\u003e\u003c/h5\u003e\n        \n            \u003cdiv class=\"card\"\u003e\n                \u003cdiv class=\"card-body\"\u003e\u003cp\u003e\u003cstrong\u003eNormal Declaration:\u003c/strong\u003e\u003c/p\u003e\n\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-go\" data-lang=\"go\"\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#66d9ef\"\u003evar\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003emsg\u003c/span\u003e \u003cspan style=\"color:#66d9ef\"\u003estring\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#a6e22e\"\u003emsg\u003c/span\u003e = \u003cspan 
style=\"color:#e6db74\"\u003e\u0026#34;Hello\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003c/div\u003e\n            \u003c/div\u003e\n        \n            \u003cdiv class=\"card\"\u003e\n                \u003cdiv class=\"card-body\"\u003e\u003cp\u003e\u003cstrong\u003eShortcut:\u003c/strong\u003e\u003c/p\u003e\n\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-go\" data-lang=\"go\"\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#a6e22e\"\u003emsg\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e:=\u003c/span\u003e \u003cspan style=\"color:#e6db74\"\u003e\u0026#34;Hello\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003c/div\u003e\n            \u003c/div\u003e\n        \n    \u003c/div\u003e\n\u003c/div\u003e\n\n\u003c!-- Declaring Constants --\u003e\n\u003cdiv class=\"note-card \"\u003e\n    \u003cdiv class=\"item\"\u003e\n        \u003ch5 class=\"note-title\"\u003e\u003cspan\u003eConstants\u003c/span\u003e\u003c/h5\u003e\n        \n            \u003cdiv class=\"card\"\u003e\n                \u003cdiv class=\"card-body\"\u003e\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-go\" data-lang=\"go\"\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#66d9ef\"\u003econst\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003ePhi\u003c/span\u003e = \u003cspan style=\"color:#ae81ff\"\u003e1.618\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003c/div\u003e\n            \u003c/div\u003e\n        \n    
\u003c/div\u003e\n\u003c/div\u003e","tags":null,"title":"Introduction"},{"categories":null,"contents":" Strings str := \u0026#34;Hello\u0026#34; Multiline string\nstr := `Multiline string` Numbers Typical types\nnum := 3 // int num := 3. // float64 num := 3 + 4i // complex128 num := byte(\u0026#39;a\u0026#39;) // byte (alias for uint8) Other Types\nvar u uint = 7 // uint (unsigned) var p float32 = 22.7 // 32-bit float Arrays // var numbers [5]int numbers := [...]int{0, 0, 0, 0, 0} Pointers func main () { b := *getPointer() fmt.Println(\u0026#34;Value is\u0026#34;, b) func getPointer () (myPointer *int) { a := 234 return \u0026amp;a a := new(int) *a = 234 Pointers point to a memory location of a variable. Go is fully garbage-collected.\nType Conversion i := 2 f := float64(i) u := uint(i) Slice slice := []int{2, 3, 4} slice := []byte(\u0026#34;Hello\u0026#34;) ","date":"January 1, 0001","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/notes/go/basic/types/","summary":"\u003c!-- String Type --\u003e\n\u003cdiv class=\"note-card \"\u003e\n    \u003cdiv class=\"item\"\u003e\n        \u003ch5 class=\"note-title\"\u003e\u003cspan\u003eStrings\u003c/span\u003e\u003c/h5\u003e\n        \n            \u003cdiv class=\"card\"\u003e\n                \u003cdiv class=\"card-body\"\u003e\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-go\" data-lang=\"go\"\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#a6e22e\"\u003estr\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e:=\u003c/span\u003e \u003cspan style=\"color:#e6db74\"\u003e\u0026#34;Hello\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003cp\u003eMultiline string\u003c/p\u003e\n\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" 
style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-go\" data-lang=\"go\"\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#a6e22e\"\u003estr\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e:=\u003c/span\u003e \u003cspan style=\"color:#e6db74\"\u003e`Multiline\n\u003c/span\u003e\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#e6db74\"\u003estring`\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003c/div\u003e\n            \u003c/div\u003e\n        \n    \u003c/div\u003e\n\u003c/div\u003e\n\n\u003c!-- Number Types --\u003e\n\u003cdiv class=\"note-card \"\u003e\n    \u003cdiv class=\"item\"\u003e\n        \u003ch5 class=\"note-title\"\u003e\u003cspan\u003eNumbers\u003c/span\u003e\u003c/h5\u003e\n        \n            \u003cdiv class=\"card\"\u003e\n                \u003cdiv class=\"card-body\"\u003e\u003cp\u003eTypical types\u003c/p\u003e\n\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-go\" data-lang=\"go\"\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#a6e22e\"\u003enum\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e:=\u003c/span\u003e \u003cspan style=\"color:#ae81ff\"\u003e3\u003c/span\u003e          \u003cspan style=\"color:#75715e\"\u003e// int\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#a6e22e\"\u003enum\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e:=\u003c/span\u003e \u003cspan style=\"color:#ae81ff\"\u003e3.\u003c/span\u003e         \u003cspan style=\"color:#75715e\"\u003e// 
float64\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#a6e22e\"\u003enum\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e:=\u003c/span\u003e \u003cspan style=\"color:#ae81ff\"\u003e3\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e+\u003c/span\u003e \u003cspan style=\"color:#ae81ff\"\u003e4i\u003c/span\u003e     \u003cspan style=\"color:#75715e\"\u003e// complex128\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#a6e22e\"\u003enum\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e:=\u003c/span\u003e byte(\u003cspan style=\"color:#e6db74\"\u003e\u0026#39;a\u0026#39;\u003c/span\u003e)  \u003cspan style=\"color:#75715e\"\u003e// byte (alias for uint8)\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003cp\u003eOther Types\u003c/p\u003e\n\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-go\" data-lang=\"go\"\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#66d9ef\"\u003evar\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003eu\u003c/span\u003e \u003cspan style=\"color:#66d9ef\"\u003euint\u003c/span\u003e = \u003cspan style=\"color:#ae81ff\"\u003e7\u003c/span\u003e        \u003cspan style=\"color:#75715e\"\u003e// uint (unsigned)\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#66d9ef\"\u003evar\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003ep\u003c/span\u003e \u003cspan style=\"color:#66d9ef\"\u003efloat32\u003c/span\u003e = \u003cspan style=\"color:#ae81ff\"\u003e22.7\u003c/span\u003e  \u003cspan style=\"color:#75715e\"\u003e// 32-bit 
float\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003c/div\u003e\n            \u003c/div\u003e\n        \n    \u003c/div\u003e\n\u003c/div\u003e\n\n\u003c!----------- Arrays  ------\u003e\n\u003cdiv class=\"note-card \"\u003e\n    \u003cdiv class=\"item\"\u003e\n        \u003ch5 class=\"note-title\"\u003e\u003cspan\u003eArrays\u003c/span\u003e\u003c/h5\u003e\n        \n            \u003cdiv class=\"card\"\u003e\n                \u003cdiv class=\"card-body\"\u003e\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-go\" data-lang=\"go\"\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#75715e\"\u003e// var numbers [5]int\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#a6e22e\"\u003enumbers\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e:=\u003c/span\u003e [\u003cspan style=\"color:#f92672\"\u003e...\u003c/span\u003e]\u003cspan style=\"color:#66d9ef\"\u003eint\u003c/span\u003e{\u003cspan style=\"color:#ae81ff\"\u003e0\u003c/span\u003e, \u003cspan style=\"color:#ae81ff\"\u003e0\u003c/span\u003e, \u003cspan style=\"color:#ae81ff\"\u003e0\u003c/span\u003e, \u003cspan style=\"color:#ae81ff\"\u003e0\u003c/span\u003e, \u003cspan style=\"color:#ae81ff\"\u003e0\u003c/span\u003e}\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003c/div\u003e\n            \u003c/div\u003e\n        \n    \u003c/div\u003e\n\u003c/div\u003e\n\n\u003c!-- Pointers --\u003e\n\u003cdiv class=\"note-card medium-note\"\u003e\n    \u003cdiv class=\"item\"\u003e\n        \u003ch5 class=\"note-title\"\u003e\u003cspan\u003ePointers\u003c/span\u003e\u003c/h5\u003e\n        \n            \u003cdiv class=\"card\"\u003e\n                \u003cdiv 
class=\"card-body\"\u003e\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-go\" data-lang=\"go\"\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#66d9ef\"\u003efunc\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003emain\u003c/span\u003e () {\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  \u003cspan style=\"color:#a6e22e\"\u003eb\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e:=\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e*\u003c/span\u003e\u003cspan style=\"color:#a6e22e\"\u003egetPointer\u003c/span\u003e()\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  \u003cspan style=\"color:#a6e22e\"\u003efmt\u003c/span\u003e.\u003cspan style=\"color:#a6e22e\"\u003ePrintln\u003c/span\u003e(\u003cspan style=\"color:#e6db74\"\u003e\u0026#34;Value is\u0026#34;\u003c/span\u003e, \u003cspan style=\"color:#a6e22e\"\u003eb\u003c/span\u003e)\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-go\" data-lang=\"go\"\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#66d9ef\"\u003efunc\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003egetPointer\u003c/span\u003e () (\u003cspan style=\"color:#a6e22e\"\u003emyPointer\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e*\u003c/span\u003e\u003cspan style=\"color:#66d9ef\"\u003eint\u003c/span\u003e) {\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  \u003cspan style=\"color:#a6e22e\"\u003ea\u003c/span\u003e \u003cspan 
style=\"color:#f92672\"\u003e:=\u003c/span\u003e \u003cspan style=\"color:#ae81ff\"\u003e234\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  \u003cspan style=\"color:#66d9ef\"\u003ereturn\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e\u0026amp;\u003c/span\u003e\u003cspan style=\"color:#a6e22e\"\u003ea\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-go\" data-lang=\"go\"\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#a6e22e\"\u003ea\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e:=\u003c/span\u003e new(\u003cspan style=\"color:#66d9ef\"\u003eint\u003c/span\u003e)\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#f92672\"\u003e*\u003c/span\u003e\u003cspan style=\"color:#a6e22e\"\u003ea\u003c/span\u003e = \u003cspan style=\"color:#ae81ff\"\u003e234\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003cp\u003ePointers point to a memory location of a variable. 
Go is fully garbage-collected.\u003c/p\u003e","tags":null,"title":"Basic Types"},{"categories":null,"contents":"","date":"January 1, 0001","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/notes/go/advanced/_index.bn/","summary":"","tags":null,"title":"অ্যাডভান্সড"},{"categories":null,"contents":" Condition if day == \u0026#34;sunday\u0026#34; || day == \u0026#34;saturday\u0026#34; { rest() } else if day == \u0026#34;monday\u0026#34; \u0026amp;\u0026amp; isTired() { groan() } else { work() } if _, err := doThing(); err != nil { fmt.Println(\u0026#34;Uh oh\u0026#34;) Switch switch day { case \u0026#34;sunday\u0026#34;: // cases don\u0026#39;t \u0026#34;fall through\u0026#34; by default! fallthrough case \u0026#34;saturday\u0026#34;: rest() default: work() } Loop for count := 0; count \u0026lt;= 10; count++ { fmt.Println(\u0026#34;My counter is at\u0026#34;, count) } entry := []string{\u0026#34;Jack\u0026#34;,\u0026#34;John\u0026#34;,\u0026#34;Jones\u0026#34;} for i, val := range entry { fmt.Printf(\u0026#34;At position %d, the character %s is present\\n\u0026#34;, i, val) n := 0 x := 42 for n != x { n := guess() } ","date":"January 1, 0001","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/notes/go/basic/flow-control/","summary":"\u003c!-- Condition --\u003e\n\u003cdiv class=\"note-card \"\u003e\n    \u003cdiv class=\"item\"\u003e\n        \u003ch5 class=\"note-title\"\u003e\u003cspan\u003eCondition\u003c/span\u003e\u003c/h5\u003e\n        \n            \u003cdiv class=\"card\"\u003e\n                \u003cdiv class=\"card-body\"\u003e\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-go\" data-lang=\"go\"\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#66d9ef\"\u003eif\u003c/span\u003e \u003cspan 
style=\"color:#a6e22e\"\u003eday\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e==\u003c/span\u003e \u003cspan style=\"color:#e6db74\"\u003e\u0026#34;sunday\u0026#34;\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e||\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003eday\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e==\u003c/span\u003e \u003cspan style=\"color:#e6db74\"\u003e\u0026#34;saturday\u0026#34;\u003c/span\u003e {\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  \u003cspan style=\"color:#a6e22e\"\u003erest\u003c/span\u003e()\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e} \u003cspan style=\"color:#66d9ef\"\u003eelse\u003c/span\u003e \u003cspan style=\"color:#66d9ef\"\u003eif\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003eday\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e==\u003c/span\u003e \u003cspan style=\"color:#e6db74\"\u003e\u0026#34;monday\u0026#34;\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e\u0026amp;\u0026amp;\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003eisTired\u003c/span\u003e() {\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  \u003cspan style=\"color:#a6e22e\"\u003egroan\u003c/span\u003e()\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e} \u003cspan style=\"color:#66d9ef\"\u003eelse\u003c/span\u003e {\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  \u003cspan style=\"color:#a6e22e\"\u003ework\u003c/span\u003e()\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e}\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" 
style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-go\" data-lang=\"go\"\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#66d9ef\"\u003eif\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003e_\u003c/span\u003e, \u003cspan style=\"color:#a6e22e\"\u003eerr\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e:=\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003edoThing\u003c/span\u003e(); \u003cspan style=\"color:#a6e22e\"\u003eerr\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e!=\u003c/span\u003e \u003cspan style=\"color:#66d9ef\"\u003enil\u003c/span\u003e {\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  \u003cspan style=\"color:#a6e22e\"\u003efmt\u003c/span\u003e.\u003cspan style=\"color:#a6e22e\"\u003ePrintln\u003c/span\u003e(\u003cspan style=\"color:#e6db74\"\u003e\u0026#34;Uh oh\u0026#34;\u003c/span\u003e)\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003c/div\u003e\n            \u003c/div\u003e\n        \n    \u003c/div\u003e\n\u003c/div\u003e\n\n\u003c!-- Switch --\u003e\n\u003cdiv class=\"note-card \"\u003e\n    \u003cdiv class=\"item\"\u003e\n        \u003ch5 class=\"note-title\"\u003e\u003cspan\u003eSwitch\u003c/span\u003e\u003c/h5\u003e\n        \n            \u003cdiv class=\"card\"\u003e\n                \u003cdiv class=\"card-body\"\u003e\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-go\" data-lang=\"go\"\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#66d9ef\"\u003eswitch\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003eday\u003c/span\u003e {\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  \u003cspan 
style=\"color:#66d9ef\"\u003ecase\u003c/span\u003e \u003cspan style=\"color:#e6db74\"\u003e\u0026#34;sunday\u0026#34;\u003c/span\u003e:\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e    \u003cspan style=\"color:#75715e\"\u003e// cases don\u0026#39;t \u0026#34;fall through\u0026#34; by default!\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e    \u003cspan style=\"color:#66d9ef\"\u003efallthrough\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  \u003cspan style=\"color:#66d9ef\"\u003ecase\u003c/span\u003e \u003cspan style=\"color:#e6db74\"\u003e\u0026#34;saturday\u0026#34;\u003c/span\u003e:\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e    \u003cspan style=\"color:#a6e22e\"\u003erest\u003c/span\u003e()\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  \u003cspan style=\"color:#66d9ef\"\u003edefault\u003c/span\u003e:\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e    \u003cspan style=\"color:#a6e22e\"\u003ework\u003c/span\u003e()\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e}\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003c/div\u003e\n            \u003c/div\u003e\n        \n    \u003c/div\u003e\n\u003c/div\u003e\n\n\u003c!-- Loop --\u003e\n\u003cdiv class=\"note-card \"\u003e\n    \u003cdiv class=\"item\"\u003e\n        \u003ch5 class=\"note-title\"\u003e\u003cspan\u003eLoop\u003c/span\u003e\u003c/h5\u003e\n        \n            \u003cdiv class=\"card\"\u003e\n                \u003cdiv 
class=\"card-body\"\u003e\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-go\" data-lang=\"go\"\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#66d9ef\"\u003efor\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003ecount\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e:=\u003c/span\u003e \u003cspan style=\"color:#ae81ff\"\u003e0\u003c/span\u003e; \u003cspan style=\"color:#a6e22e\"\u003ecount\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e\u0026lt;=\u003c/span\u003e \u003cspan style=\"color:#ae81ff\"\u003e10\u003c/span\u003e; \u003cspan style=\"color:#a6e22e\"\u003ecount\u003c/span\u003e\u003cspan style=\"color:#f92672\"\u003e++\u003c/span\u003e {\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  \u003cspan style=\"color:#a6e22e\"\u003efmt\u003c/span\u003e.\u003cspan style=\"color:#a6e22e\"\u003ePrintln\u003c/span\u003e(\u003cspan style=\"color:#e6db74\"\u003e\u0026#34;My counter is at\u0026#34;\u003c/span\u003e, \u003cspan style=\"color:#a6e22e\"\u003ecount\u003c/span\u003e)\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e}\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-go\" data-lang=\"go\"\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#a6e22e\"\u003eentry\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e:=\u003c/span\u003e []\u003cspan style=\"color:#66d9ef\"\u003estring\u003c/span\u003e{\u003cspan style=\"color:#e6db74\"\u003e\u0026#34;Jack\u0026#34;\u003c/span\u003e,\u003cspan 
style=\"color:#e6db74\"\u003e\u0026#34;John\u0026#34;\u003c/span\u003e,\u003cspan style=\"color:#e6db74\"\u003e\u0026#34;Jones\u0026#34;\u003c/span\u003e}\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#66d9ef\"\u003efor\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003ei\u003c/span\u003e, \u003cspan style=\"color:#a6e22e\"\u003eval\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e:=\u003c/span\u003e \u003cspan style=\"color:#66d9ef\"\u003erange\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003eentry\u003c/span\u003e {\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  \u003cspan style=\"color:#a6e22e\"\u003efmt\u003c/span\u003e.\u003cspan style=\"color:#a6e22e\"\u003ePrintf\u003c/span\u003e(\u003cspan style=\"color:#e6db74\"\u003e\u0026#34;At position %d, the character %s is present\\n\u0026#34;\u003c/span\u003e, \u003cspan style=\"color:#a6e22e\"\u003ei\u003c/span\u003e, \u003cspan style=\"color:#a6e22e\"\u003eval\u003c/span\u003e)\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-go\" data-lang=\"go\"\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#a6e22e\"\u003en\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e:=\u003c/span\u003e \u003cspan style=\"color:#ae81ff\"\u003e0\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#a6e22e\"\u003ex\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e:=\u003c/span\u003e \u003cspan style=\"color:#ae81ff\"\u003e42\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan 
style=\"color:#66d9ef\"\u003efor\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003en\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e!=\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003ex\u003c/span\u003e {\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  \u003cspan style=\"color:#a6e22e\"\u003en\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e:=\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003eguess\u003c/span\u003e()\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e}\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003c/div\u003e\n            \u003c/div\u003e\n        \n    \u003c/div\u003e\n\u003c/div\u003e","tags":null,"title":"Flow Control"},{"categories":null,"contents":" Condition if day == \u0026#34;sunday\u0026#34; || day == \u0026#34;saturday\u0026#34; { rest() } else if day == \u0026#34;monday\u0026#34; \u0026amp;\u0026amp; isTired() { groan() } else { work() } if _, err := doThing(); err != nil { fmt.Println(\u0026#34;Uh oh\u0026#34;) ","date":"January 1, 0001","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/notes/go/advanced/files/","summary":"\u003c!-- Condition --\u003e\n\u003cdiv class=\"note-card \"\u003e\n    \u003cdiv class=\"item\"\u003e\n        \u003ch5 class=\"note-title\"\u003e\u003cspan\u003eCondition\u003c/span\u003e\u003c/h5\u003e\n        \n            \u003cdiv class=\"card\"\u003e\n                \u003cdiv class=\"card-body\"\u003e\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-go\" data-lang=\"go\"\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#66d9ef\"\u003eif\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003eday\u003c/span\u003e \u003cspan 
style=\"color:#f92672\"\u003e==\u003c/span\u003e \u003cspan style=\"color:#e6db74\"\u003e\u0026#34;sunday\u0026#34;\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e||\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003eday\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e==\u003c/span\u003e \u003cspan style=\"color:#e6db74\"\u003e\u0026#34;saturday\u0026#34;\u003c/span\u003e {\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  \u003cspan style=\"color:#a6e22e\"\u003erest\u003c/span\u003e()\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e} \u003cspan style=\"color:#66d9ef\"\u003eelse\u003c/span\u003e \u003cspan style=\"color:#66d9ef\"\u003eif\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003eday\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e==\u003c/span\u003e \u003cspan style=\"color:#e6db74\"\u003e\u0026#34;monday\u0026#34;\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e\u0026amp;\u0026amp;\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003eisTired\u003c/span\u003e() {\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  \u003cspan style=\"color:#a6e22e\"\u003egroan\u003c/span\u003e()\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e} \u003cspan style=\"color:#66d9ef\"\u003eelse\u003c/span\u003e {\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  \u003cspan style=\"color:#a6e22e\"\u003ework\u003c/span\u003e()\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e}\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-go\" data-lang=\"go\"\u003e\u003cspan 
style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#66d9ef\"\u003eif\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003e_\u003c/span\u003e, \u003cspan style=\"color:#a6e22e\"\u003eerr\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e:=\u003c/span\u003e \u003cspan style=\"color:#a6e22e\"\u003edoThing\u003c/span\u003e(); \u003cspan style=\"color:#a6e22e\"\u003eerr\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e!=\u003c/span\u003e \u003cspan style=\"color:#66d9ef\"\u003enil\u003c/span\u003e {\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  \u003cspan style=\"color:#a6e22e\"\u003efmt\u003c/span\u003e.\u003cspan style=\"color:#a6e22e\"\u003ePrintln\u003c/span\u003e(\u003cspan style=\"color:#e6db74\"\u003e\u0026#34;Uh oh\u0026#34;\u003c/span\u003e)\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e}\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003c/div\u003e\n            \u003c/div\u003e\n        \n    \u003c/div\u003e\n\u003c/div\u003e","tags":null,"title":"File Manipulation"},{"categories":null,"contents":" Variable NAME=\u0026#34;John\u0026#34; echo $NAME echo \u0026#34;$NAME\u0026#34; echo \u0026#34;${NAME}\u0026#34; Condition if [[ -z \u0026#34;$string\u0026#34; ]]; then echo \u0026#34;String is empty\u0026#34; elif [[ -n \u0026#34;$string\u0026#34; ]]; then echo \u0026#34;String is not empty\u0026#34; fi ","date":"January 1, 0001","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/notes/bash/basic/","summary":"\u003c!-- Variable --\u003e\n\u003cdiv class=\"note-card \"\u003e\n    \u003cdiv class=\"item\"\u003e\n        \u003ch5 class=\"note-title\"\u003e\u003cspan\u003eVariable\u003c/span\u003e\u003c/h5\u003e\n        \n            \u003cdiv class=\"card\"\u003e\n                \u003cdiv class=\"card-body\"\u003e\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" 
style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-bash\" data-lang=\"bash\"\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003eNAME\u003cspan style=\"color:#f92672\"\u003e=\u003c/span\u003e\u003cspan style=\"color:#e6db74\"\u003e\u0026#34;John\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003eecho $NAME\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003eecho \u003cspan style=\"color:#e6db74\"\u003e\u0026#34;\u003c/span\u003e$NAME\u003cspan style=\"color:#e6db74\"\u003e\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003eecho \u003cspan style=\"color:#e6db74\"\u003e\u0026#34;\u003c/span\u003e\u003cspan style=\"color:#e6db74\"\u003e${\u003c/span\u003eNAME\u003cspan style=\"color:#e6db74\"\u003e}\u003c/span\u003e\u003cspan style=\"color:#e6db74\"\u003e\u0026#34;\n\u003c/span\u003e\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003c/div\u003e\n            \u003c/div\u003e\n        \n    \u003c/div\u003e\n\u003c/div\u003e\n\n\u003c!-- Condition --\u003e\n\u003cdiv class=\"note-card \"\u003e\n    \u003cdiv class=\"item\"\u003e\n        \u003ch5 class=\"note-title\"\u003e\u003cspan\u003eCondition\u003c/span\u003e\u003c/h5\u003e\n        \n            \u003cdiv class=\"card\"\u003e\n                \u003cdiv class=\"card-body\"\u003e\u003cdiv class=\"highlight\"\u003e\u003cpre tabindex=\"0\" style=\"color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;\"\u003e\u003ccode class=\"language-bash\" data-lang=\"bash\"\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#66d9ef\"\u003eif\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e[[\u003c/span\u003e -z \u003cspan 
style=\"color:#e6db74\"\u003e\u0026#34;\u003c/span\u003e$string\u003cspan style=\"color:#e6db74\"\u003e\u0026#34;\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e]]\u003c/span\u003e; \u003cspan style=\"color:#66d9ef\"\u003ethen\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  echo \u003cspan style=\"color:#e6db74\"\u003e\u0026#34;String is empty\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#66d9ef\"\u003eelif\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e[[\u003c/span\u003e -n \u003cspan style=\"color:#e6db74\"\u003e\u0026#34;\u003c/span\u003e$string\u003cspan style=\"color:#e6db74\"\u003e\u0026#34;\u003c/span\u003e \u003cspan style=\"color:#f92672\"\u003e]]\u003c/span\u003e; \u003cspan style=\"color:#66d9ef\"\u003ethen\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e  echo \u003cspan style=\"color:#e6db74\"\u003e\u0026#34;String is not empty\u0026#34;\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003cspan style=\"display:flex;\"\u003e\u003cspan\u003e\u003cspan style=\"color:#66d9ef\"\u003efi\u003c/span\u003e\n\u003c/span\u003e\u003c/span\u003e\u003c/code\u003e\u003c/pre\u003e\u003c/div\u003e\u003c/div\u003e\n            \u003c/div\u003e\n        \n    \u003c/div\u003e\n\u003c/div\u003e","tags":null,"title":"Bash Variables"},{"categories":null,"contents":"About Tech Tweedie Hello, I\u0026rsquo;m Ian Tweedie, and welcome to my personal technical blog, Tech Tweedie.\nI am a passionate technologist with a deep focus on the Microsoft Power Platform and Azure. 
This site is where I share my knowledge, experience, and insights gained from years of working with these powerful technologies.\nMy Mission The purpose of this site is to provide high-quality, practical, and real-world technical content for developers, architects, and consultants working with the Power Platform. I aim to go beyond the basics, offering deep dives into complex topics, solutions to common challenges, and expert commentary on the future of low-code and pro-code development.\nMy Expertise My professional life revolves around designing and building enterprise-scale solutions. My areas of expertise include:\nPower Platform Governance \u0026amp; ALM: Establishing robust governance frameworks and Application Lifecycle Management (ALM) strategies for enterprise clients. Pro-Code Development: Extending the Power Platform with custom code, including PCF controls, Azure Functions, and API integrations. Solution Architecture: Designing scalable, secure, and maintainable solutions that solve real business problems. Community Leadership: I am an active member of the Power Platform community, frequently speaking at user groups and conferences. This site is run by my company, Tweed Technology Ltd.\nWhy Trust This Site? Every article, tutorial, and guide on this site is written by me, based on my direct experience. I am committed to accuracy, clarity, and providing value to my readers. 
This is not a corporate blog; it\u0026rsquo;s a platform for sharing authentic, expert-led content from a practitioner who is in the trenches every day.\nThank you for visiting, and I hope you find the content here useful and inspiring.\n","date":"April 5, 2026","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/about/","summary":"\u003ch2 id=\"about-tech-tweedie\"\u003eAbout Tech Tweedie\u003c/h2\u003e\n\u003cp\u003eHello, I\u0026rsquo;m \u003cstrong\u003eIan Tweedie\u003c/strong\u003e, and welcome to my personal technical blog, Tech Tweedie.\u003c/p\u003e\n\u003cp\u003eI am a passionate technologist with a deep focus on the Microsoft Power Platform and Azure. This site is where I share my knowledge, experience, and insights gained from years of working with these powerful technologies.\u003c/p\u003e\n\u003ch3 id=\"my-mission\"\u003eMy Mission\u003c/h3\u003e\n\u003cp\u003eThe purpose of this site is to provide high-quality, practical, and real-world technical content for developers, architects, and consultants working with the Power Platform. I aim to go beyond the basics, offering deep dives into complex topics, solutions to common challenges, and expert commentary on the future of low-code and pro-code development.\u003c/p\u003e","tags":null,"title":"About Ian Tweedie"},{"categories":null,"contents":"Contact Us Thank you for your interest in Tech Tweedie. 
Whether you have a question, feedback, or a data protection inquiry, I\u0026rsquo;d love to hear from you.\nGeneral \u0026amp; Technical Inquiries For questions about articles, technical topics, or speaking engagements, the best way to reach me is via social media or email.\nEmail: hello@tweed.technology LinkedIn: https://www.linkedin.com/in/ian-tweedie/ Twitter/X: @iTweedie Data Protection \u0026amp; Privacy Inquiries For any questions related to our Privacy Policy, Cookie Policy, or your personal data, please email our dedicated privacy contact.\nPrivacy Email: privacy@tweed.technology This site is operated by:\nTweed Technology Ltd\nCompany Number: 07820713\nUnited Kingdom\n","date":"April 5, 2026","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/contact/","summary":"\u003ch2 id=\"contact-us\"\u003eContact Us\u003c/h2\u003e\n\u003cp\u003eThank you for your interest in Tech Tweedie. Whether you have a question, feedback, or a data protection inquiry, I\u0026rsquo;d love to hear from you.\u003c/p\u003e\n\u003ch3 id=\"general--technical-inquiries\"\u003eGeneral \u0026amp; Technical Inquiries\u003c/h3\u003e\n\u003cp\u003eFor questions about articles, technical topics, or speaking engagements, the best way to reach me is via social media or email.\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003e\u003cstrong\u003eEmail:\u003c/strong\u003e \u003ca href=\"mailto:hello@tweed.technology\"\u003ehello@tweed.technology\u003c/a\u003e\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eLinkedIn:\u003c/strong\u003e \u003ca href=\"https://www.linkedin.com/in/iantweedie/\" target=\"_blank\" rel=\"noopener\"\u003ehttps://www.linkedin.com/in/ian-tweedie/\u003c/a\u003e\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eTwitter/X:\u003c/strong\u003e \u003ca href=\"https://twitter.com/iTweedie\" target=\"_blank\" rel=\"noopener\"\u003e@iTweedie\u003c/a\u003e\u003c/li\u003e\n\u003c/ul\u003e\n\u003ch3 id=\"data-protection--privacy-inquiries\"\u003eData Protection 
\u0026amp; Privacy Inquiries\u003c/h3\u003e\n\u003cp\u003eFor any questions related to our Privacy Policy, Cookie Policy, or your personal data, please email our dedicated privacy contact.\u003c/p\u003e","tags":null,"title":"Contact"},{"categories":null,"contents":"Cookie Policy Last updated: 5 April 2026\nThis Cookie Policy explains how Tech Tweedie (\u0026ldquo;we\u0026rdquo;, \u0026ldquo;us\u0026rdquo;, and \u0026ldquo;our\u0026rdquo;) uses cookies and similar technologies to recognise you when you visit our website at https://techtweedie.github.io. It explains what these technologies are and why we use them, as well as your rights to control our use of them.\n1. What are cookies? A cookie is a small text file that a website saves on your computer or mobile device when you visit the site. It enables the website to remember your actions and preferences (such as login, language, font size, and other display preferences) over a period of time, so you don’t have to keep re-entering them whenever you come back to the site or browse from one page to another.\n2. Why do we use cookies? We use cookies for several reasons. Some cookies are required for technical reasons in order for our website to operate, and we refer to these as \u0026ldquo;essential\u0026rdquo; or \u0026ldquo;strictly necessary\u0026rdquo; cookies. Other cookies enable us to track and target the interests of our users to enhance the experience on our website. Third parties serve cookies through our website for advertising, analytics, and other purposes.\n3. Types of Cookies We Use a. Essential Cookies These cookies are strictly necessary to provide you with services available through our website and to use some of its features, such as access to secure areas. Because these cookies are strictly necessary to deliver the website, you cannot refuse them without impacting how our website functions.\nb. 
Analytics and Performance Cookies These cookies are used to collect information about traffic to our website and how users use the website. The information gathered does not identify any individual visitor. We use this information to help us understand how our website is being used and how we can improve your experience. We use Google Analytics for this purpose.\nc. Advertising Cookies This website uses Google AdSense to display advertisements. These cookies are used to make advertising messages more relevant to you. They perform functions like preventing the same ad from continuously reappearing, ensuring that ads are properly displayed for advertisers, and in some cases selecting advertisements that are based on your interests.\nThird-party vendors, including Google, use cookies to serve ads based on your prior visits to this website. Google\u0026rsquo;s use of the DoubleClick cookie enables it and its partners to serve ads to our users based on their visit to our sites and/or other sites on the Internet.\n4. Your Consent When you first visit our site, we will show you a cookie banner requesting your consent to set non-essential cookies (Analytics and Advertising). By clicking \u0026ldquo;Accept all\u0026rdquo;, you agree to the placement of these cookies. If you click \u0026ldquo;Reject non-essential\u0026rdquo;, we will only use essential cookies.\n5. How to Manage Cookies You have the right to decide whether to accept or reject cookies.\nCookie Banner: You can exercise your cookie preferences via the cookie consent banner that appears on your first visit. Browser Controls: You can set or amend your web browser controls to accept or refuse cookies. If you choose to reject cookies, you may still use our website though your access to some functionality and areas of our website may be restricted. 
As the means by which you can refuse cookies through your web browser controls vary from browser to browser, you should visit your browser\u0026rsquo;s help menu for more information. 6. More Information For more information about our privacy practices, please review our Privacy Policy.\nIf you have any questions about our use of cookies, please contact us.\n","date":"April 5, 2026","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/cookie-policy/","summary":"\u003ch2 id=\"cookie-policy\"\u003eCookie Policy\u003c/h2\u003e\n\u003cp\u003e\u003cstrong\u003eLast updated: 5 April 2026\u003c/strong\u003e\u003c/p\u003e\n\u003cp\u003eThis Cookie Policy explains how Tech Tweedie (\u0026ldquo;we\u0026rdquo;, \u0026ldquo;us\u0026rdquo;, and \u0026ldquo;our\u0026rdquo;) uses cookies and similar technologies to recognise you when you visit our website at \u003ca href=\"https://techtweedie.github.io\" target=\"_blank\" rel=\"noopener\"\u003ehttps://techtweedie.github.io\u003c/a\u003e. It explains what these technologies are and why we use them, as well as your rights to control our use of them.\u003c/p\u003e\n\u003chr\u003e\n\u003ch3 id=\"1-what-are-cookies\"\u003e1. What are cookies?\u003c/h3\u003e\n\u003cp\u003eA cookie is a small text file that a website saves on your computer or mobile device when you visit the site. 
It enables the website to remember your actions and preferences (such as login, language, font size, and other display preferences) over a period of time, so you don’t have to keep re-entering them whenever you come back to the site or browse from one page to another.\u003c/p\u003e","tags":null,"title":"Cookie Policy"},{"categories":null,"contents":"Privacy Policy Last updated: 5 April 2026\nThis website, Tech Tweedie (accessible at https://techtweedie.github.io), is operated by:\nTweed Technology Ltd\nCompany Number: 07820713\nCorporate website: https://tweed.technology/\nTweed Technology Ltd is the data controller responsible for your personal data under the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018.\nWe are committed to protecting your privacy. This policy explains what personal data we collect, how we use it, and your rights concerning that data.\n1. What Information We Collect We collect information to provide and improve our service. The types of data we collect depend on your interaction with our site.\nTechnical Data: When you visit, we may automatically collect your IP address, browser type, operating system, and referring URLs. This is standard practice for most websites and is used for security and troubleshooting. Usage Data: We collect information about how you use our website, such as pages visited, time spent on pages, and links clicked. We use this to understand what content is popular and to improve the site. Cookie Data: We use cookies for analytics and advertising. Please see our Cookie Policy for detailed information. Contact Data: If you contact us directly via email, we will have your email address and any information you provide in your message. 2. How We Use Your Information We use your data for the following purposes:\nTo operate and maintain our website: Ensuring the site is secure and performs correctly. To improve our website: Analysing usage data helps us refine our content and user experience. 
To comply with legal obligations: We may need to process your data to comply with legal or regulatory requirements. To serve advertisements: We use Google AdSense to display ads, which helps fund the site. 3. Google AdSense \u0026amp; Advertising This site uses Google AdSense, an advertising service provided by Google.\nThird-party vendors, including Google, use cookies to serve ads based on a user\u0026rsquo;s prior visits to this website or other websites. Google\u0026rsquo;s use of advertising cookies (such as the DoubleClick cookie) enables it and its partners to serve ads to you based on your visit to this site and/or other sites on the Internet. These ads may be personalised based on your interests, demographics, and other information Google has collected. In some regions, you may only be shown non-personalised ads. How to Control Advertising Cookies:\nYou can opt out of personalised advertising by visiting Google\u0026rsquo;s Ad Settings. Alternatively, you can opt out of a third-party vendor\u0026rsquo;s use of cookies for personalised advertising by visiting www.aboutads.info.\n4. Lawful Basis for Processing Under UK GDPR, we process your personal data on the following lawful bases:\nConsent: For non-essential cookies (analytics and advertising) and when you contact us. You can withdraw your consent at any time. Legitimate Interests: For essential website operation, security, and analysing aggregated, anonymised usage data to improve our services. We ensure our legitimate interests do not override your rights and freedoms. 5. Data Retention We retain personal data only for as long as necessary to fulfil the purposes we collected it for, including for the purposes of satisfying any legal, accounting, or reporting requirements.\nContact Data: Retained for as long as necessary to resolve your query and for a reasonable period thereafter for record-keeping. Analytics Data: Retained in an aggregated, anonymised form indefinitely. 6. 
Your Data Protection Rights Under UK GDPR, you have the following rights:\nThe right to be informed: To be told how your personal information will be used. The right of access: To request a copy of the personal data we hold about you. The right to rectification: To request that we correct any inaccurate personal data. The right to erasure: To request that we delete your personal data, where there is no compelling reason for us to keep it. The right to restrict processing: To request that we suspend the processing of your personal data. The right to data portability: To request a copy of your data in a machine-readable format. The right to object: To object to us processing your data (for example, for direct marketing). Rights in relation to automated decision making and profiling. To exercise any of these rights, please contact us.\n7. Data Security We have implemented appropriate security measures to prevent your personal data from being accidentally lost, used, or accessed in an unauthorised way.\n8. Contact Us For any questions about this privacy policy or our data protection practices, please contact us at:\nprivacy@tweed.technology\n9. Complaints You have the right to make a complaint at any time to the Information Commissioner\u0026rsquo;s Office (ICO), the UK supervisory authority for data protection issues (www.ico.org.uk). 
We would, however, appreciate the chance to deal with your concerns before you approach the ICO, so please contact us in the first instance.\n","date":"April 5, 2026","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/privacy-policy/","summary":"\u003ch2 id=\"privacy-policy\"\u003ePrivacy Policy\u003c/h2\u003e\n\u003cp\u003e\u003cstrong\u003eLast updated: 5 April 2026\u003c/strong\u003e\u003c/p\u003e\n\u003cp\u003eThis website, \u003cstrong\u003eTech Tweedie\u003c/strong\u003e (accessible at \u003ca href=\"https://techtweedie.github.io\" target=\"_blank\" rel=\"noopener\"\u003ehttps://techtweedie.github.io\u003c/a\u003e), is operated by:\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eTweed Technology Ltd\u003c/strong\u003e\u003cbr\u003e\nCompany Number: \u003cstrong\u003e07820713\u003c/strong\u003e\u003cbr\u003e\nCorporate website: \u003ca href=\"https://tweed.technology/\" target=\"_blank\" rel=\"noopener\"\u003ehttps://tweed.technology/\u003c/a\u003e\u003c/p\u003e\n\u003cp\u003eTweed Technology Ltd is the data controller responsible for your personal data under the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018.\u003c/p\u003e\n\u003cp\u003eWe are committed to protecting your privacy. This policy explains what personal data we collect, how we use it, and your rights concerning that data.\u003c/p\u003e","tags":null,"title":"Privacy Policy"},{"categories":null,"contents":"Terms of Use Last updated: 5 April 2026\nPlease read these Terms of Use (\u0026ldquo;Terms\u0026rdquo;) carefully before using the Tech Tweedie website (https://techtweedie.github.io) operated by Tweed Technology Ltd.\nYour access to and use of the website is conditioned on your acceptance of and compliance with these Terms. These Terms apply to all visitors, users, and others who access or use the website.\nBy accessing or using the website, you agree to be bound by these Terms. 
If you disagree with any part of the terms, then you may not access the website.\n1. Acceptable Use You agree not to use the website in any way that causes, or may cause, damage to the website or impairment of the availability or accessibility of the website; or in any way which is unlawful, illegal, fraudulent, or harmful.\nYou must not use this website to copy, store, host, transmit, send, use, publish, or distribute any material which consists of (or is linked to) any spyware, computer virus, Trojan horse, worm, keystroke logger, rootkit, or other malicious computer software.\n2. Intellectual Property Unless otherwise stated, Tweed Technology Ltd and/or its licensors own the intellectual property rights for all material on Tech Tweedie. All intellectual property rights are reserved.\nYou may view and/or print pages from https://techtweedie.github.io for your own personal use subject to restrictions set in these terms and conditions.\nYou must not:\nRepublish material from this website Sell, rent, or sub-license material from this website Reproduce, duplicate, or copy material from this website 3. Disclaimer of Warranties The content on this website is provided for general information and educational purposes only. It is not intended to be, and should not be used as, a substitute for professional advice.\nWhile we strive to provide accurate and up-to-date information, we make no representations or warranties of any kind, express or implied, about the completeness, accuracy, reliability, suitability, or availability with respect to the website or the information, products, services, or related graphics contained on the website for any purpose. Any reliance you place on such information is therefore strictly at your own risk.\nThe technical advice and code samples provided are for demonstration purposes. You are responsible for testing and validating them in your own environment. We are not liable for any issues that arise from their use.\n4. 
Limitation of Liability In no event will Tweed Technology Ltd, nor its directors, employees, partners, agents, suppliers, or affiliates, be liable for any indirect, incidental, special, consequential or punitive damages, including without limitation, loss of profits, data, use, goodwill, or other intangible losses, resulting from your access to or use of or inability to access or use the website.\n5. External Links Disclaimer This website may contain links to external websites that are not provided or maintained by or in any way affiliated with Tweed Technology Ltd. Please note that we do not guarantee the accuracy, relevance, timeliness, or completeness of any information on these external websites.\n6. Governing Law These Terms shall be governed and construed in accordance with the laws of England and Wales, without regard to its conflict of law provisions.\n7. Changes We reserve the right, at our sole discretion, to modify or replace these Terms at any time. We will provide notice of any changes by posting the new Terms on this page.\n8. Contact Us If you have any questions about these Terms, please contact us.\n","date":"April 5, 2026","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/terms/","summary":"\u003ch2 id=\"terms-of-use\"\u003eTerms of Use\u003c/h2\u003e\n\u003cp\u003e\u003cstrong\u003eLast updated: 5 April 2026\u003c/strong\u003e\u003c/p\u003e\n\u003cp\u003ePlease read these Terms of Use (\u0026ldquo;Terms\u0026rdquo;) carefully before using the Tech Tweedie website (\u003ca href=\"https://techtweedie.github.io\" target=\"_blank\" rel=\"noopener\"\u003ehttps://techtweedie.github.io\u003c/a\u003e) operated by Tweed Technology Ltd.\u003c/p\u003e\n\u003cp\u003eYour access to and use of the website is conditioned on your acceptance of and compliance with these Terms. 
These Terms apply to all visitors, users, and others who access or use the website.\u003c/p\u003e\n\u003cp\u003eBy accessing or using the website, you agree to be bound by these Terms. If you disagree with any part of the terms, then you may not access the website.\u003c/p\u003e","tags":null,"title":"Terms of Use"},{"categories":["M365 Show"],"contents":"Watch the Episode About In this new episode of the D365 Show, we will channel just a touch of our Indiana Jones selves to explore a treasured topic: reclaiming the human side of CRM consulting.\nIn a world captivated by copilots, automated insights, and rapid deployments, some have traded core app skills for the latest AI tools. But technology alone does not transform organizations. People do.\nLet us navigate together:\nThe definition of CRM consulting and how it has evolved through the years What the hottest versus most critical skills are How the introduction of AI has changed the game Where real human impact is made, before and after project delivery How workplace culture can evolve to better acknowledge the humans behind the wheel Consulting is not about the gold at the end of the tunnel; it is about making sure the whole team makes it out of the temple alive, empowered, and ready for the next adventure.\nEvent Details Title: Raiders of the Lost Consulting Ark: Revolutionizing CRM Workplace Culture\nEvent by: M365 Show\nWhen: Wed, Mar 18, 2026, 4:00 PM GMT\nEvent Link: https://www.linkedin.com/events/raidersofthelostconsultingark-r7435352966477500417/theater/\nThose Involved Mirko Peters Francesco Di Donato Angeliki Patsiavou Daniel Barber Ian Tweedie ","date":"March 18, 2026","hero":"/podcasts/m365-show/260318-raiders-of-the-lost-consulting-ark-revolutionizing-crm-workplace-culture/image.png","permalink":"https://techtweedie.github.io/podcasts/m365-show/260318-raiders-of-the-lost-consulting-ark-revolutionizing-crm-workplace-culture/","summary":"","tags":["M365 Show","CRM","Consulting","AI","Workplace 
Culture"],"title":"Raiders of the Lost Consulting Ark: Revolutionizing CRM Workplace Culture"},{"categories":["Power Platform","Azure DevOps"],"contents":" 📋 REUSABLE PATTERN\nThis document provides a proven, reusable pattern for implementing Power Platform DevOps pipelines. Use this template to standardize your solution export and version control processes across your organization. Pattern Summary Pattern Name: Power Platform Solution Export Pipeline\nCategory: DevOps \u0026amp; CI/CD\nPlatform: Azure DevOps + Power Platform\nDifficulty: Beginner to Intermediate\nTime to Implement: 1-2 hours\nWhat This Pattern Solves Manual solution exports prone to human error Lack of version control for Power Platform solutions Inconsistent deployment processes across environments No audit trail for solution changes Difficulty collaborating on Power Platform development Pattern Outcomes After implementing this pattern, you will have:\n✅ Automated solution exports from your development environment\n✅ Version-controlled source code for all solution components\n✅ Consistent deployment artifacts (managed \u0026amp; unmanaged solutions)\n✅ Environment-specific settings files for configuration management\n✅ Audit trail of all solution changes through Git history\nPattern Overview This pattern demonstrates how to implement a simple yet effective DevOps pipeline for Power Platform solutions using Azure DevOps. 
The pipeline automates the export of solutions from your development environment, manages version control, and prepares your solutions for deployment across multiple environments.\nWhat This Pipeline Does flowchart TD A[Manual Trigger] --\u003e B[Checkout Code] B --\u003e C[Install Power Platform Tools] C --\u003e D[Set Solution Version] D --\u003e E[Export Managed Solution] E --\u003e F[Export Unmanaged Solution] F --\u003e G[Unpack Solution] G --\u003e H[Create Settings File] H --\u003e I[Commit \u0026 Push to Git] style A fill:#e1f5fe style I fill:#c8e6c9 Prerequisites Before implementing this pipeline, ensure you have:\nAzure DevOps Organization with appropriate permissions Power Platform Environment (Development/Source) Service Principal configured for Power Platform authentication Git Repository for solution source control Power Platform Build Tools extension installed in Azure DevOps Pipeline Configuration Variables The pipeline uses two key variables that you need to customize:\nVariable Description Example Value varPowerPlatformSPN Service connection name for Power Platform authentication Dataverse - Backup varSolutionName Name of the solution to export ProjectExpenseLogger Pipeline Steps Explained sequenceDiagram participant ADO as Azure DevOps participant PP as Power Platform participant Git as Git Repository ADO-\u003e\u003eADO: Install Power Platform Tools ADO-\u003e\u003ePP: Set Solution Version Note over PP: Version: 1.0.0.{BuildID} ADO-\u003e\u003ePP: Export Managed Solution ADO-\u003e\u003ePP: Export Unmanaged Solution ADO-\u003e\u003eADO: Unpack Solution Files ADO-\u003e\u003eADO: Generate Settings File ADO-\u003e\u003eGit: Commit All Changes ADO-\u003e\u003eGit: Push to Repository Complete Pipeline YAML name: EXPORT $(TeamProject)_$(BuildDefinitionName)_$(SourceBranchName)_$(Date:yyyyMMdd)$(Rev:.r) variables: - name: varPowerPlatformSPN # value: YOUR-OWN-VALUE-HERE value: Dataverse - Backup - name: varSolutionName # value: YOUR-OWN-VALUE-HERE 
value: ProjectExpenseLogger trigger: none pool: vmImage: \u0026#39;windows-latest\u0026#39; steps: - checkout: self persistCredentials: true clean: true - task: PowerPlatformToolInstaller@2 inputs: DefaultVersion: true AddToolsToPath: true - task: PowerPlatformSetSolutionVersion@2 inputs: authenticationType: \u0026#39;PowerPlatformSPN\u0026#39; PowerPlatformSPN: \u0026#39;$(varPowerPlatformSPN)\u0026#39; SolutionName: \u0026#39;$(varSolutionName)\u0026#39; SolutionVersionNumber: \u0026#39;1.0.0.$(Build.BuildID)\u0026#39; - task: PowerPlatformExportSolution@2 inputs: authenticationType: \u0026#39;PowerPlatformSPN\u0026#39; PowerPlatformSPN: \u0026#39;$(varPowerPlatformSPN)\u0026#39; SolutionName: \u0026#39;$(varSolutionName)\u0026#39; SolutionOutputFile: \u0026#39;$(Build.SourcesDirectory)\\solutions\\$(varSolutionName)_1.0.0.$(Build.BuildID)_managed.zip\u0026#39; Managed: true AsyncOperation: true MaxAsyncWaitTime: \u0026#39;60\u0026#39; - task: PowerPlatformExportSolution@2 inputs: authenticationType: \u0026#39;PowerPlatformSPN\u0026#39; PowerPlatformSPN: \u0026#39;$(varPowerPlatformSPN)\u0026#39; SolutionName: \u0026#39;$(varSolutionName)\u0026#39; SolutionOutputFile: \u0026#39;$(Build.SourcesDirectory)\\solutions\\$(varSolutionName)_1.0.0.$(Build.BuildID).zip\u0026#39; Managed: false AsyncOperation: true MaxAsyncWaitTime: \u0026#39;60\u0026#39; - task: PowerPlatformUnpackSolution@2 inputs: SolutionInputFile: \u0026#39;$(Build.SourcesDirectory)\\solutions\\$(varSolutionName)_1.0.0.$(Build.BuildID).zip\u0026#39; SolutionTargetFolder: \u0026#39;$(Build.SourcesDirectory)\\solutions\\src\\$(varSolutionName)\u0026#39; SolutionType: \u0026#39;Both\u0026#39; - task: PowerShell@2 inputs: targetType: \u0026#39;inline\u0026#39; script: \u0026#39;pac solution create-settings --solution-zip $(Build.SourcesDirectory)\\solutions\\$(varSolutionName)_1.0.0.$(Build.BuildID).zip --settings-file $(Build.SourcesDirectory)\\solutions\\$(varSolutionName)-settings.json\u0026#39; - 
task: CmdLine@2 inputs: script: | echo commit all changes git config user.email \u0026#34;$(Build.RequestedForEmail)\u0026#34; git config user.name \u0026#34;$(Build.RequestedFor)\u0026#34; git checkout -b main git add --all git commit -m \u0026#34;Latest solution changes.\u0026#34; echo push code to new repo git -c http.extraheader=\u0026#34;AUTHORIZATION: bearer $(System.AccessToken)\u0026#34; push origin main Step-by-Step Breakdown 1. Repository Setup - checkout: self persistCredentials: true clean: true Purpose: Ensures we have access to the repository with credentials to push changes back persistCredentials: Enables the pipeline to push changes to Git clean: Starts with a clean working directory 2. Tool Installation - task: PowerPlatformToolInstaller@2 inputs: DefaultVersion: true AddToolsToPath: true Purpose: Installs the Power Platform CLI tools needed for solution management DefaultVersion: Uses the latest stable version AddToolsToPath: Makes tools available throughout the pipeline 3. Version Management - task: PowerPlatformSetSolutionVersion@2 inputs: authenticationType: \u0026#39;PowerPlatformSPN\u0026#39; PowerPlatformSPN: \u0026#39;$(varPowerPlatformSPN)\u0026#39; SolutionName: \u0026#39;$(varSolutionName)\u0026#39; SolutionVersionNumber: \u0026#39;1.0.0.$(Build.BuildID)\u0026#39; Purpose: Sets a unique version number for the solution using the build ID Pattern: 1.0.0.{BuildNumber} ensures each export has a unique version Benefits: Enables proper version tracking and rollback capabilities 4. 
Solution Export (Managed) - task: PowerPlatformExportSolution@2 inputs: authenticationType: \u0026#39;PowerPlatformSPN\u0026#39; PowerPlatformSPN: \u0026#39;$(varPowerPlatformSPN)\u0026#39; SolutionName: \u0026#39;$(varSolutionName)\u0026#39; SolutionOutputFile: \u0026#39;$(Build.SourcesDirectory)\\solutions\\$(varSolutionName)_1.0.0.$(Build.BuildID)_managed.zip\u0026#39; Managed: true AsyncOperation: true MaxAsyncWaitTime: \u0026#39;60\u0026#39; Purpose: Exports the managed version of the solution for production deployments Managed Solutions: Cannot be modified after import, ideal for production AsyncOperation: Handles large solutions that take time to export File Naming: Includes version number for easy identification 5. Solution Export (Unmanaged) - task: PowerPlatformExportSolution@2 inputs: authenticationType: \u0026#39;PowerPlatformSPN\u0026#39; PowerPlatformSPN: \u0026#39;$(varPowerPlatformSPN)\u0026#39; SolutionName: \u0026#39;$(varSolutionName)\u0026#39; SolutionOutputFile: \u0026#39;$(Build.SourcesDirectory)\\solutions\\$(varSolutionName)_1.0.0.$(Build.BuildID).zip\u0026#39; Managed: false AsyncOperation: true MaxAsyncWaitTime: \u0026#39;60\u0026#39; Purpose: Exports the unmanaged version for development/testing environments Unmanaged Solutions: Can be modified after import, useful for development Use Case: Development and UAT environment deployments 6. Solution Unpacking - task: PowerPlatformUnpackSolution@2 inputs: SolutionInputFile: \u0026#39;$(Build.SourcesDirectory)\\solutions\\$(varSolutionName)_1.0.0.$(Build.BuildID).zip\u0026#39; SolutionTargetFolder: \u0026#39;$(Build.SourcesDirectory)\\solutions\\src\\$(varSolutionName)\u0026#39; SolutionType: \u0026#39;Both\u0026#39; Purpose: Unpacks the solution into individual files for version control Benefits: Enables file-by-file tracking of changes Supports merge conflict resolution Provides visibility into solution components Enables collaborative development 7. 
Settings File Generation - task: PowerShell@2 inputs: targetType: \u0026#39;inline\u0026#39; script: \u0026#39;pac solution create-settings --solution-zip $(Build.SourcesDirectory)\\solutions\\$(varSolutionName)_1.0.0.$(Build.BuildID).zip --settings-file $(Build.SourcesDirectory)\\solutions\\$(varSolutionName)-settings.json\u0026#39; Purpose: Creates a settings file that can be used to configure environment-specific settings Use Case: Different connection strings, URLs, or configurations per environment Format: JSON file containing deployment settings 8. Git Operations - task: CmdLine@2 inputs: script: | echo commit all changes git config user.email \u0026#34;$(Build.RequestedForEmail)\u0026#34; git config user.name \u0026#34;$(Build.RequestedFor)\u0026#34; git checkout -b main git add --all git commit -m \u0026#34;Latest solution changes.\u0026#34; echo push code to new repo git -c http.extraheader=\u0026#34;AUTHORIZATION: bearer $(System.AccessToken)\u0026#34; push origin main Purpose: Commits all exported and unpacked files to the repository Process: Configures Git with the build requester\u0026rsquo;s identity Switches to main branch Stages all changes Creates a commit with descriptive message Pushes changes back to the repository Architecture Overview graph TB subgraph \"Development Environment\" Dev[Power Platform Dev Environment] Sol[Solution: ProjectExpenseLogger] end subgraph \"Azure DevOps\" Pipeline[Export Pipeline] Artifacts[Build Artifacts] end subgraph \"Git Repository\" Source[Source Code] Managed[Managed Solutions] Settings[Settings Files] end subgraph \"Target Environments\" Test[Test Environment] UAT[UAT Environment] Prod[Production Environment] end Dev --\u003e Pipeline Pipeline --\u003e Artifacts Pipeline --\u003e Source Pipeline --\u003e Managed Pipeline --\u003e Settings Managed --\u003e Test Managed --\u003e UAT Managed --\u003e Prod Source --\u003e Test Settings --\u003e Test Settings --\u003e UAT Settings --\u003e Prod style Dev 
fill:#e3f2fd style Pipeline fill:#fff3e0 style Source fill:#e8f5e8 style Prod fill:#fce4ec Benefits of This Approach Version Control Complete History: Track every change to your Power Platform solution Branching: Support multiple developers working on different features Rollback: Easy to revert to previous versions when needed Compliance: Audit trail for regulatory requirements Automated Exports Consistency: Same export process every time Scheduled: Can be triggered manually or on schedule Reliable: Reduces human error in export process Efficient: Saves developer time Multi-Environment Support Managed Solutions: For production deployments Unmanaged Solutions: For development/testing Settings Files: Environment-specific configurations Artifacts: Versioned deployment packages File Structure After Export /solutions/ ├── ProjectExpenseLogger_1.0.0.123_managed.zip # Managed solution ├── ProjectExpenseLogger_1.0.0.123.zip # Unmanaged solution ├── ProjectExpenseLogger-settings.json # Deployment settings └── src/ └── ProjectExpenseLogger/ # Unpacked source ├── CanvasApps/ ├── Entities/ ├── OptionSets/ ├── Roles/ ├── Workflows/ └── Other/ Next Steps Once you have this export pipeline working, consider adding:\nImport Pipeline: Automate deployment to test environments Solution Checker: Validate solution quality automatically Environment Variables: Better configuration management Approval Gates: Control production deployments Notifications: Alert teams of successful exports/deployments Common Issues and Solutions Authentication Problems Verify Service Principal has correct permissions Check Power Platform Service Connection configuration Ensure SPN has System Administrator role in target environment Solution Export Failures Check solution dependencies are included Verify solution exists in the source environment Increase MaxAsyncWaitTime for large solutions Git Push Issues Ensure pipeline has Contribute permissions to repository Check branch policies don\u0026rsquo;t block 
automated commits Verify System.AccessToken is available to the pipeline Customization Options Variable Modifications variables: - name: varEnvironmentUrl value: https://yourorg.crm.dynamics.com - name: varBranchName value: $(Build.SourceBranchName) - name: varVersionMajor value: 1 - name: varVersionMinor value: 0 Advanced Version Numbering SolutionVersionNumber: \u0026#39;$(varVersionMajor).$(varVersionMinor).$(Build.BuildID).$(Rev:r)\u0026#39; Multiple Solutions Export Duplicate the export tasks for each solution you want to include in your pipeline.\nThis pattern provides a solid foundation for Power Platform DevOps practices and can be extended based on your specific organizational needs.\nPattern Implementation Checklist Use this checklist to ensure successful implementation:\nBefore You Start Azure DevOps organization with admin access Power Platform development environment Service Principal created and configured Git repository prepared for solution storage Power Platform Build Tools extension installed Implementation Steps Create new Azure DevOps pipeline Configure variables (varPowerPlatformSPN, varSolutionName) Set up service connection to Power Platform Copy and customize the YAML pipeline code Test pipeline execution with a sample solution Verify Git commits and solution artifacts Document pipeline for team use After Implementation Train team on pipeline usage Establish solution naming conventions Set up monitoring and notifications Plan for import/deployment pipelines Regular pipeline maintenance and updates Related Patterns Consider implementing these complementary patterns:\nPower Platform Import Pipeline - Deploy solutions to target environments Solution Checker Integration - Automated quality validation Environment Variable Management - Configuration across environments Approval Gates Pattern - Control production deployments Multi-Solution Pipeline - Handle multiple solutions in one pipeline This pattern is part of a comprehensive Power Platform 
DevOps pattern library. For more patterns and best practices, visit our pattern collection.\n","date":"March 10, 2026","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/pattern/ppcicd/power-platform-export-to-devops-simple-pipeline/","summary":"\u003cdiv class=\"alert info\"\u003e\n    \u003cspan\u003e\u003ci data-feather=\"info\"\u003e\u003c/i\u003e\u003c/span\u003e\n    \u003cspan\u003e\u003cstrong\u003e\u003cstrong\u003e📋 REUSABLE PATTERN\u003c/strong\u003e\u003cbr\u003e\nThis document provides a proven, reusable pattern for implementing Power Platform DevOps pipelines. Use this template to standardize your solution export and version control processes across your organization.\u003c/strong\u003e\u003c/span\u003e\n\u003c/div\u003e\n\n\u003ch2 id=\"pattern-summary\"\u003ePattern Summary\u003c/h2\u003e\n\u003cp\u003e\u003cstrong\u003ePattern Name\u003c/strong\u003e: Power Platform Solution Export Pipeline\u003cbr\u003e\n\u003cstrong\u003eCategory\u003c/strong\u003e: DevOps \u0026amp; CI/CD\u003cbr\u003e\n\u003cstrong\u003ePlatform\u003c/strong\u003e: Azure DevOps + Power Platform\u003cbr\u003e\n\u003cstrong\u003eDifficulty\u003c/strong\u003e: Beginner to Intermediate\u003cbr\u003e\n\u003cstrong\u003eTime to Implement\u003c/strong\u003e: 1-2 hours\u003c/p\u003e\n\u003ch3 id=\"what-this-pattern-solves\"\u003eWhat This Pattern Solves\u003c/h3\u003e\n\u003cul\u003e\n\u003cli\u003eManual solution exports prone to human error\u003c/li\u003e\n\u003cli\u003eLack of version control for Power Platform solutions\u003c/li\u003e\n\u003cli\u003eInconsistent deployment processes across environments\u003c/li\u003e\n\u003cli\u003eNo audit trail for solution changes\u003c/li\u003e\n\u003cli\u003eDifficulty collaborating on Power Platform development\u003c/li\u003e\n\u003c/ul\u003e\n\u003ch3 id=\"pattern-outcomes\"\u003ePattern Outcomes\u003c/h3\u003e\n\u003cp\u003eAfter implementing this pattern, you will have:\u003c/p\u003e","tags":["Power 
Platform","Azure DevOps","DevOps","CI/CD","Solution Management"],"title":"Power Platform EXPORT to DevOps - Simple Pipeline"},{"categories":["Power Platform","Azure DevOps"],"contents":" 📋 REUSABLE PATTERN\nThis document provides a proven, reusable pattern for implementing Power Platform solution deployment pipelines. Use this template to standardize your solution import and deployment processes across your organization. Pattern Summary Pattern Name: Power Platform Solution Import Pipeline\nCategory: DevOps \u0026amp; CI/CD\nPlatform: Azure DevOps + Power Platform\nDifficulty: Beginner to Intermediate\nTime to Implement: 1-2 hours\nWhat This Pattern Solves Manual solution deployments prone to human error Inconsistent deployment processes across environments Lack of automated testing after deployment No standardized approach for environment-specific configurations Difficulty maintaining deployment consistency across teams Pattern Outcomes After implementing this pattern, you will have:\n✅ Automated solution packaging from source-controlled files\n✅ Consistent deployments across all target environments\n✅ Environment-specific configuration management\n✅ Standardized deployment process reducing manual errors\n✅ Audit trail of all deployments through Azure DevOps history\nPattern Overview This pattern demonstrates how to implement a simple yet effective deployment pipeline for Power Platform solutions using Azure DevOps. 
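Environment-specific values travel in a deployment settings file alongside the solution. As a sketch, the JSON emitted by pac solution create-settings has the general shape below; the environment variable schema name, connection reference logical name, and URL are illustrative assumptions, not values from this pattern:

```json
{
  "EnvironmentVariables": [
    {
      "SchemaName": "contoso_ApiBaseUrl",
      "Value": "https://uat-api.contoso.example"
    }
  ],
  "ConnectionReferences": [
    {
      "LogicalName": "contoso_sharedcommondataservice",
      "ConnectionId": "",
      "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_commondataserviceforapps"
    }
  ]
}
```

Keeping one settings file per target environment, and passing the matching file to the import for that environment, is what makes the same packaged solution deployable to Test, UAT, and Production.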
The pipeline takes source-controlled solution files, packages them into deployable solutions, and imports them into target Power Platform environments.\nWhat This Pipeline Does flowchart TD A[Manual/Automated Trigger] --\u003e B[Checkout Source Code] B --\u003e C[Install Power Platform Tools] C --\u003e D[Pack Solution from Source] D --\u003e E[Deploy to Target Environment] E --\u003e F[Verify Deployment Success] style A fill:#e1f5fe style F fill:#c8e6c9 Prerequisites Before implementing this pipeline, ensure you have:\nAzure DevOps Organization with appropriate permissions Power Platform Target Environment (Test/UAT/Production) Service Principal configured for Power Platform authentication Git Repository containing unpacked solution source files Power Platform Build Tools extension installed in Azure DevOps Source solution files from a corresponding export pipeline Pipeline Configuration Variables The pipeline uses key variables that you need to customize:\nVariable Description Example Value varPowerPlatformSPN Service connection name for Power Platform authentication Dataverse - mightora varSolutionName Name of the solution to import FirstPipeline varTargetEnvironment Target environment URL (optional) https://mightora.crm11.dynamics.com/ Pipeline Steps Explained sequenceDiagram participant ADO as Azure DevOps participant Repo as Git Repository participant PP as Power Platform ADO-\u003e\u003eRepo: Checkout Source Files ADO-\u003e\u003eADO: Install Power Platform Tools ADO-\u003e\u003eADO: Pack Solution from Source Note over ADO: Creates deployable .zip ADO-\u003e\u003ePP: Import Solution Note over PP: Deploys to environment ADO-\u003e\u003ePP: Verify Import Status Complete Pipeline YAML name: IMPORT $(TeamProject)_$(BuildDefinitionName)_$(SourceBranchName)_$(Date:yyyyMMdd)$(Rev:.r) variables: - name: varSolutionName # value: YOUR-OWN-VALUE-HERE value: FirstPipeline - name: varPowerPlatformSPN # value: YOUR-OWN-VALUE-HERE value: Dataverse - mightora - name: 
varTargetEnvironment # value: YOUR-OWN-VALUE-HERE value: https://mightora.crm11.dynamics.com/ trigger: none pool: vmImage: \u0026#39;windows-latest\u0026#39; steps: - checkout: self persistCredentials: true clean: true - task: PowerPlatformToolInstaller@2 inputs: DefaultVersion: true AddToolsToPath: true - task: PowerPlatformPackSolution@2 inputs: SolutionSourceFolder: \u0026#39;$(Build.SourcesDirectory)\\solutions\\src\\$(varSolutionName)\u0026#39; SolutionOutputFile: \u0026#39;$(Build.ArtifactStagingDirectory)\\solutions\\build\\$(varSolutionName).zip\u0026#39; SolutionType: \u0026#39;Unmanaged\u0026#39; - task: PowerPlatformImportSolution@2 inputs: authenticationType: \u0026#39;PowerPlatformSPN\u0026#39; PowerPlatformSPN: \u0026#39;$(varPowerPlatformSPN)\u0026#39; Environment: \u0026#39;$(varTargetEnvironment)\u0026#39; SolutionInputFile: \u0026#39;$(Build.ArtifactStagingDirectory)\\solutions\\build\\$(varSolutionName).zip\u0026#39; AsyncOperation: true MaxAsyncWaitTime: \u0026#39;60\u0026#39; PublishWorkflows: true Step-by-Step Breakdown 1. Repository Setup - checkout: self persistCredentials: true clean: true Purpose: Retrieves the source-controlled solution files from the repository persistCredentials: Enables potential Git operations during deployment clean: Ensures a clean working directory for consistent builds 2. Tool Installation - task: PowerPlatformToolInstaller@2 inputs: DefaultVersion: true AddToolsToPath: true Purpose: Installs the Power Platform CLI tools needed for solution packaging and deployment DefaultVersion: Uses the latest stable version AddToolsToPath: Makes tools available throughout the pipeline 3. 
Solution Packaging - task: PowerPlatformPackSolution@2 inputs: SolutionSourceFolder: \u0026#39;$(Build.SourcesDirectory)\\solutions\\src\\$(varSolutionName)\u0026#39; SolutionOutputFile: \u0026#39;$(Build.ArtifactStagingDirectory)\\solutions\\build\\$(varSolutionName).zip\u0026#39; SolutionType: \u0026#39;Unmanaged\u0026#39; Purpose: Packages the source-controlled files into a deployable solution zip file SolutionSourceFolder: Location of unpacked solution files from your export pipeline SolutionOutputFile: Where the packaged solution will be saved SolutionType: Creates an unmanaged solution for development/testing environments 4. Solution Import/Deployment - task: PowerPlatformImportSolution@2 inputs: authenticationType: \u0026#39;PowerPlatformSPN\u0026#39; PowerPlatformSPN: \u0026#39;$(varPowerPlatformSPN)\u0026#39; Environment: \u0026#39;$(varTargetEnvironment)\u0026#39; SolutionInputFile: \u0026#39;$(Build.ArtifactStagingDirectory)\\solutions\\build\\$(varSolutionName).zip\u0026#39; AsyncOperation: true MaxAsyncWaitTime: \u0026#39;60\u0026#39; PublishWorkflows: true Purpose: Imports the packaged solution into the target Power Platform environment Environment: Target environment URL where solution will be deployed AsyncOperation: Handles large solutions that take time to import MaxAsyncWaitTime: Maximum wait time for import completion PublishWorkflows: Automatically activates workflows after import Architecture Overview graph TB subgraph \"Source Control\" Source[Solution Source Files] Config[Environment Configs] end subgraph \"Azure DevOps\" Pipeline[Import Pipeline] Build[Build Artifacts] Pack[Solution Packaging] end subgraph \"Target Environments\" Test[Test Environment] UAT[UAT Environment] Prod[Production Environment] end Source --\u003e Pipeline Config --\u003e Pipeline Pipeline --\u003e Pack Pack --\u003e Build Build --\u003e Test Build --\u003e UAT Build --\u003e Prod style Source fill:#e8f5e8 style Pipeline fill:#fff3e0 style Test fill:#e3f2fd style UAT 
fill:#f3e5f5 style Prod fill:#fce4ec Benefits of This Approach Automated Deployments Consistency: Same deployment process every time Reliability: Reduces human error in deployment process Speed: Faster than manual deployments Repeatability: Can deploy to multiple environments with same process Environment Management Environment-Specific: Different configurations per environment Controlled: Managed through Azure DevOps permissions and approvals Traceable: Full audit trail of what was deployed when Rollback: Easy to redeploy previous versions Integration Benefits CI/CD Integration: Works with broader DevOps practices Automated Testing: Can include post-deployment validation Notifications: Automatic alerts on success/failure Approval Gates: Manual approvals for production deployments Deployment Flow sequenceDiagram participant Dev as Developer participant Repo as Git Repository participant ADO as Azure DevOps participant Test as Test Environment participant Prod as Production Dev-\u003e\u003eRepo: Push solution changes Repo-\u003e\u003eADO: Trigger import pipeline ADO-\u003e\u003eADO: Package solution ADO-\u003e\u003eTest: Deploy to test Note over Test: Automated testing ADO-\u003e\u003eADO: Wait for approval Note over ADO: Manual gate ADO-\u003e\u003eProd: Deploy to production ADO-\u003e\u003eDev: Notify completion Common Use Cases Development Environment Deployments Deploy latest changes for developer testing Quick iteration and feedback cycles Automated deployment on code commits UAT Environment Deployments Scheduled deployments for user testing Integration with test automation Approval gates for quality control Production Deployments Controlled releases with approvals Deployment windows and scheduling Rollback capabilities for issues File Structure Expected This pipeline expects the following structure from your export pipeline:\n/solutions/ └── src/ └── {SolutionName}/ # Unpacked solution folder ├── CanvasApps/ ├── Entities/ ├── OptionSets/ ├── Roles/ ├── 
Workflows/ ├── Other/ └── Solution.xml # Solution definition Environment-Specific Configurations Multiple Environment Variables variables: - name: varTestEnvironment value: https://test-org.crm.dynamics.com/ - name: varUATEnvironment value: https://uat-org.crm.dynamics.com/ - name: varProdEnvironment value: https://prod-org.crm.dynamics.com/ Conditional Deployment - task: PowerPlatformImportSolution@2 condition: and(succeeded(), eq(variables[\u0026#39;Build.SourceBranch\u0026#39;], \u0026#39;refs/heads/main\u0026#39;)) inputs: Environment: \u0026#39;$(varProdEnvironment)\u0026#39; # ... other inputs Common Issues and Solutions Packaging Failures Missing Files: Ensure all solution components are in source control Invalid Structure: Verify folder structure matches Power Platform standards Dependencies: Check that dependent solutions are already in target environment Import Failures Environment Access: Verify Service Principal has System Administrator role Dependency Issues: Import dependent solutions first Customization Conflicts: Resolve conflicts with existing customizations Authentication Problems Service Connection: Verify Power Platform Service Connection is properly configured Permissions: Ensure SPN has appropriate permissions in target environment Environment URL: Confirm the target environment URL is correct and accessible Advanced Configurations Solution Settings Integration - task: PowerPlatformApplySettings@2 inputs: authenticationType: \u0026#39;PowerPlatformSPN\u0026#39; PowerPlatformSPN: \u0026#39;$(varPowerPlatformSPN)\u0026#39; Environment: \u0026#39;$(varTargetEnvironment)\u0026#39; SolutionName: \u0026#39;$(varSolutionName)\u0026#39; SettingsFile: \u0026#39;$(Build.SourcesDirectory)\\solutions\\$(varSolutionName)-settings.json\u0026#39; Post-Deployment Validation - task: PowerShell@2 displayName: \u0026#39;Validate Deployment\u0026#39; inputs: targetType: \u0026#39;inline\u0026#39; script: | # Add custom validation scripts here Write-Host 
\u0026#34;Validating solution deployment...\u0026#34; # Check for specific entities, processes, etc. Managed Solution Deployment For production environments, consider using managed solutions:\n- task: PowerPlatformPackSolution@2 inputs: SolutionSourceFolder: \u0026#39;$(Build.SourcesDirectory)\\solutions\\src\\$(varSolutionName)\u0026#39; SolutionOutputFile: \u0026#39;$(Build.ArtifactStagingDirectory)\\solutions\\build\\$(varSolutionName)_managed.zip\u0026#39; SolutionType: \u0026#39;Managed\u0026#39; Integration with Export Pipeline This import pipeline works best when paired with the export pipeline:\nExport Pipeline creates source-controlled files Import Pipeline deploys those files to target environments Continuous Flow from development to production Pipeline Dependencies resources: pipelines: - pipeline: ExportPipeline source: \u0026#39;Power Platform Export Pipeline\u0026#39; trigger: branches: include: - main Next Steps Once you have this import pipeline working, consider adding:\nApproval Gates: Manual approvals for production deployments Automated Testing: Post-deployment validation scripts Rollback Capability: Quick reversion to previous versions Multi-Stage Deployments: Deploy to multiple environments in sequence Solution Checker Integration: Validate solution quality before deployment Pattern Implementation Checklist Use this checklist to ensure successful implementation:\nBefore You Start Azure DevOps organization with admin access Target Power Platform environments configured Service Principal created and configured for target environments Source solution files available in repository Power Platform Build Tools extension installed Implementation Steps Create new Azure DevOps import pipeline Configure variables (varPowerPlatformSPN, varSolutionName, varTargetEnvironment) Set up service connections to target Power Platform environments Copy and customize the YAML pipeline code Test pipeline execution with a sample solution Verify successful deployment 
in target environment Document pipeline for team use After Implementation Set up approval gates for production environments Configure deployment notifications Establish deployment schedules and windows Train team on deployment process Plan for rollback procedures Related Patterns Consider implementing these complementary patterns:\nPower Platform Export Pipeline - Source control your solutions Multi-Stage Deployment Pipeline - Deploy to multiple environments Solution Checker Integration - Automated quality validation Environment Variable Management - Configuration across environments Approval Gates Pattern - Control production deployments This pattern is part of a comprehensive Power Platform DevOps pattern library. For more patterns and best practices, visit our pattern collection.\n","date":"March 10, 2026","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/pattern/ppcicd/power-platform-import-from-devops-simple-pipeline/","summary":"\u003cdiv class=\"alert info\"\u003e\n    \u003cspan\u003e\u003ci data-feather=\"info\"\u003e\u003c/i\u003e\u003c/span\u003e\n    \u003cspan\u003e\u003cstrong\u003e\u003cstrong\u003e📋 REUSABLE PATTERN\u003c/strong\u003e\u003cbr\u003e\nThis document provides a proven, reusable pattern for implementing Power Platform solution deployment pipelines. 
Use this template to standardize your solution import and deployment processes across your organization.\u003c/strong\u003e\u003c/span\u003e\n\u003c/div\u003e\n\n\u003ch2 id=\"pattern-summary\"\u003ePattern Summary\u003c/h2\u003e\n\u003cp\u003e\u003cstrong\u003ePattern Name\u003c/strong\u003e: Power Platform Solution Import Pipeline\u003cbr\u003e\n\u003cstrong\u003eCategory\u003c/strong\u003e: DevOps \u0026amp; CI/CD\u003cbr\u003e\n\u003cstrong\u003ePlatform\u003c/strong\u003e: Azure DevOps + Power Platform\u003cbr\u003e\n\u003cstrong\u003eDifficulty\u003c/strong\u003e: Beginner to Intermediate\u003cbr\u003e\n\u003cstrong\u003eTime to Implement\u003c/strong\u003e: 1-2 hours\u003c/p\u003e\n\u003ch3 id=\"what-this-pattern-solves\"\u003eWhat This Pattern Solves\u003c/h3\u003e\n\u003cul\u003e\n\u003cli\u003eManual solution deployments prone to human error\u003c/li\u003e\n\u003cli\u003eInconsistent deployment processes across environments\u003c/li\u003e\n\u003cli\u003eLack of automated testing after deployment\u003c/li\u003e\n\u003cli\u003eNo standardized approach for environment-specific configurations\u003c/li\u003e\n\u003cli\u003eDifficulty maintaining deployment consistency across teams\u003c/li\u003e\n\u003c/ul\u003e\n\u003ch3 id=\"pattern-outcomes\"\u003ePattern Outcomes\u003c/h3\u003e\n\u003cp\u003eAfter implementing this pattern, you will have:\u003c/p\u003e","tags":["Power Platform","Azure DevOps","DevOps","CI/CD","Solution Deployment","Import Pipeline"],"title":"Power Platform IMPORT from DevOps - Simple Pipeline"},{"categories":["Power Platform Monthly Call"],"contents":"Watch the Demo About This demo shows a DevOps-based approach to generating Power Platform documentation automatically from solutions stored in Azure DevOps. 
It demonstrates how solution metadata can be converted into Markdown, diagrams, and Word documents as part of a pipeline, removing the need for manual documentation.\nThis demo was presented at the Power Platform Monthly Community Call on the 19th of November 2025.\nEveryone is welcome to these calls — download recurring invites from https://aka.ms/community/calls.\nEvent Details Event by: Microsoft Power Platform Community\nWhen: Wed, Nov 19, 2025\nEvent Link: https://aka.ms/community/calls\nResources Generate Power Platform Solution Documentation with Azure DevOps Power Platform Documentation Extension — Visual Studio Marketplace Feedback for Ian Those Involved Ian Tweedie ","date":"November 19, 2025","hero":"/podcasts/pp-monthly-call/251119-pp-monthly-call-power-platform-documentation-azure-devops/image.jpg","permalink":"https://techtweedie.github.io/podcasts/pp-monthly-call/251119-pp-monthly-call-power-platform-documentation-azure-devops/","summary":"","tags":["Power Platform","Azure DevOps","Documentation","DevOps","Community Call"],"title":"Generate Power Platform Solution Documentation with Azure DevOps"},{"categories":["How to"],"contents":"Your First Azure DevOps Pipeline for Power Platform – A Complete Beginner’s Guide Have you ever wondered how to set up Azure DevOps pipelines for the Power Platform?\nEver wished you could export your solutions safely, run tests, add security checks, generate documentation, or deploy across tenants with confidence?\nAll of this becomes simple once you introduce Azure DevOps into your Power Platform ALM story — and in this guide (based on my YouTube walkthrough), I’ll show you exactly how to build your first Azure DevOps pipeline.\nWhether you\u0026rsquo;re brand new to DevOps or simply looking to automate your Power Platform workflows, this guide will get you from nothing to a working pipeline step-by-step.\n🎥 Watch the video tutorial Introduction In this walkthrough, we’re going to build a simple Azure DevOps pipeline that:\n✔ 
Connects securely to your Dataverse environment\n✔ Exports a Power Platform solution\n✔ Unpacks it into source control\n✔ Commits the changes to your repository\nLater, you can extend this pipeline to build, deploy, test, create documentation, enforce ALM governance, and much more — but today we focus on getting your first working pipeline running.\nPrerequisites Before we touch DevOps, we need three things.\n1. Dataverse System Administrator permissions You need the ability to add/remove users (because we’ll add a service principal).\n2. An Azure DevOps organisation (Basic licence) The first five users are free, so you\u0026rsquo;re almost certainly covered.\n3. Parallelism Request (free) Azure DevOps uses a hosted VM to run pipelines, but you must request free parallelism.\nSubmit it here:\nhttps://aka.ms/azpipelines-parallelism-request\nApproval normally takes a few minutes.\nAfter that, install the Power Platform Build Tools extension from the Visual Studio Marketplace.\nStep 1 — Create an App Registration We need a service principal so our pipeline can authenticate to Dataverse.\nOpen Entra Admin Center → App Registrations Select New Registration Add API permission Dynamics CRM → user_impersonation No admin consent required Copy: Client ID Tenant ID Secret Keep them safe — we’ll need them later.\nStep 2 — Add the App User in Dataverse In the Power Platform Admin Center:\nOpen your environment Go to Settings → Users → Application Users Click New App User Choose your App Registration Assign System Administrator (for demo purposes) Step 3 — Create the DevOps Service Connection In Azure DevOps:\nOpen Project Settings → Service Connections Select New Service Connection → Power Platform Enter: Environment URL Client ID Tenant ID Secret Name it something like Dataverse - Backup This will be the authentication method for your pipeline.\nStep 4 — Structure Your Repository A recommended structure:\nMyPowerPlatformProject/ ├── solutions/ │ ├── src/ │ │ └── MySolution/ │ └── 
MySolution.zip ├── pipelines/ │ ├── export-solution.yml │ ├── build-and-deploy-solution.yml ├── documentation/ └── README.md Step 5 — Create Your First Pipeline Inside Pipelines → New Pipeline:\nChoose Azure Repos Git Select your repo Choose Starter Pipeline Save it under:\n/pipelines/export-solution.yml Your first run will likely fail due to parallelism — this is normal.\nStep 6 — Fix Parallelism (If Needed) If you get an error about free parallelism not being enabled:\n👉 Submit the form: https://aka.ms/azpipelines-parallelism-request\nOnce approved, rerun your pipeline — it should now work.\nStep 7 — Build the Export Pipeline Now we turn the basic boilerplate pipeline into something useful.\n1. Add name + variables name: $(TeamProject)_$(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.r) variables: - name: varPowerPlatformSPN value: Dataverse - Backup - name: varSolutionName value: ExpenseReportManager1 2. Use Windows build agent pool: vmImage: \u0026#39;windows-latest\u0026#39; 3. Pipeline Steps steps: - checkout: self persistCredentials: true clean: true - task: PowerPlatformToolInstaller@2 inputs: DefaultVersion: true AddToolsToPath: true - task: PowerPlatformSetSolutionVersion@2 inputs: authenticationType: \u0026#39;PowerPlatformSPN\u0026#39; PowerPlatformSPN: \u0026#39;$(varPowerPlatformSPN)\u0026#39; SolutionName: \u0026#39;$(varSolutionName)\u0026#39; SolutionVersionNumber: \u0026#39;1.0.0.$(Build.BuildID)\u0026#39; - task: PowerPlatformExportSolution@2 inputs: authenticationType: \u0026#39;PowerPlatformSPN\u0026#39; PowerPlatformSPN: \u0026#39;$(varPowerPlatformSPN)\u0026#39; SolutionName: \u0026#39;$(varSolutionName)\u0026#39; SolutionOutputFile: \u0026#39;$(Build.SourcesDirectory)\\solutions\\$(varSolutionName)_1.0.0.$(Build.BuildID)_managed.zip\u0026#39; Managed: true AsyncOperation: true MaxAsyncWaitTime: \u0026#39;60\u0026#39; - task: PowerPlatformExportSolution@2 inputs: authenticationType: \u0026#39;PowerPlatformSPN\u0026#39; PowerPlatformSPN: 
\u0026#39;$(varPowerPlatformSPN)\u0026#39; SolutionName: \u0026#39;$(varSolutionName)\u0026#39; SolutionOutputFile: \u0026#39;$(Build.SourcesDirectory)\\solutions\\$(varSolutionName)_1.0.0.$(Build.BuildID).zip\u0026#39; Managed: false AsyncOperation: true MaxAsyncWaitTime: \u0026#39;60\u0026#39; - task: PowerPlatformUnpackSolution@2 inputs: SolutionInputFile: \u0026#39;$(Build.SourcesDirectory)\\solutions\\$(varSolutionName)_1.0.0.$(Build.BuildID).zip\u0026#39; SolutionTargetFolder: \u0026#39;$(Build.SourcesDirectory)\\solutions\\src\\$(varSolutionName)\u0026#39; SolutionType: \u0026#39;Both\u0026#39; - task: PowerShell@2 inputs: targetType: \u0026#39;inline\u0026#39; script: \u0026#39;pac solution create-settings --solution-zip $(Build.SourcesDirectory)\\solutions\\$(varSolutionName)_1.0.0.$(Build.BuildID).zip --settings-file $(Build.SourcesDirectory)\\solutions\\$(varSolutionName)-settings.json\u0026#39; 4. Commit to the repository - task: CmdLine@2 inputs: script: | echo commit all changes git config user.email \u0026#34;$(Build.RequestedForEmail)\u0026#34; git config user.name \u0026#34;$(Build.RequestedFor)\u0026#34; git checkout -b main git add --all git commit -m \u0026#34;Latest solution changes.\u0026#34; echo push code to new repo git -c http.extraheader=\u0026#34;AUTHORIZATION: bearer $(System.AccessToken)\u0026#34; push origin main Step 8 — Fix Build Service Permissions (If Needed) If you see:\n“Build Service does not have Contribute permissions”\nFix it here:\nProject Settings → Repositories → {repo} → Security\nFind:\nProjectName Build Service (ProjectName)\nSet Contribute = Allow\nRe-run — pipeline succeeds.\nStep 9 — Review the Exported Solution Your repo now contains:\n✔ Managed ZIP ✔ Unmanaged ZIP ✔ Settings file ✔ Unpacked solution including: Entities Model-driven apps Canvas apps (if any) Workflows Security roles Choices Relationships Customizations Plans Everything you need for full Git-based version control.\nConclusion 🎉 Congratulations — you 
now have a working Azure DevOps export pipeline!\nThis pipeline gives you:\n✔ A safe, repeatable backup\n✔ Git history of every solution change\n✔ Managed + unmanaged builds\n✔ A foundation for full ALM automation\nFrom here, you can expand into:\nDeployment pipelines Automated versioning Power Pages / Portal pipelines Automated documentation Solution analysis \u0026amp; testing Cross-tenant deployments Pull request validation If you’d like the next blog/video to cover solution deployment, unit testing, wiki documentation generation, or advanced ALM, just let me know!\n","date":"November 18, 2025","hero":"/posts/251118-your-first-azure-devops-pipeline-for-power-platform/large.png","permalink":"https://techtweedie.github.io/posts/251118-your-first-azure-devops-pipeline-for-power-platform/","summary":"","tags":["Power Platform","DevOps"],"title":"🚀 Build Your First Azure DevOps Pipeline for Power Platform (Complete Beginner Tutorial!)"},{"categories":["Community"],"contents":"D365 \u0026amp; Power Platform UG North East – September 2025 Meetup 📅 Date: Tuesday, September 30, 2025\n🕕 Time: 6:00 PM – 8:10 PM BST\n📍 Location: Haylofts, 5 Saint Thomas\u0026rsquo; Street, Newcastle upon Tyne\n🎟️ Register Now: https://go.iantweedie.biz/meetup-d365ppugne-2509\nCo-Hosted By This event is proudly co-hosted by:\nAgata Guziur Justin Wilkinson Ian Tweedie Together, we’re bringing the North East community a fantastic evening of learning, sharing, and connection.\nConnect. Learn. Share. 
We’re excited to invite you to the September 2025 meetup of the D365 \u0026amp; Power Platform User Group North East — an evening of community learning, insight sharing, and connection with local professionals who are passionate about the Microsoft stack.\nWhether you\u0026rsquo;re a seasoned solution architect, a curious maker, or simply Power Platform-curious, you’ll find valuable sessions and great conversation.\nAgenda 18:15 – Welcome, networking, and introductions\n18:30 – Secure and Test your D365 processes with ML-powered platform capabilities\nSpeaker: Anthonio Dixon\nAnthonio will be exploring how machine learning can help you secure and test Dynamics 365 processes. Expect insights into practical implementation and how ML can help safeguard critical business applications.\n19:15 – Break, food, and networking\n19:30 – Capability \u0026amp; Standards in Power BI\nSpeaker: Jamie Shields\nJamie will share how his team has developed Power BI standards to bring consistency to their end-to-end delivery. This session will cover why standards matter, how to cut through the noise of multiple \u0026ldquo;best practices,\u0026rdquo; and practical lessons you can take back to your own team.\nWhy Attend? 
💡 Fresh insights from real-world experts 👥 Connect with the local D365 and Power Platform community 🧠 Learn something new — whether technical, strategic, or architectural 🍕 Free food and drinks — and great company 👉 Reserve Your Spot 🎟️ Click here to register for the event\nSpaces are limited — grab your seat early!\nFor post-event updates, reflections, and session write-ups — check back here or follow along on LinkedIn.\n","date":"September 30, 2025","hero":"/events/250930-d365ppugne/large.png","permalink":"https://techtweedie.github.io/events/250930-d365ppugne/","summary":"","tags":["Power Platform","Power Apps","Power Automate","Community Event","Canvas Apps","AI","Microsoft 365"],"title":"D365 \u0026 Power Platform UG North East – September 2025 Meetup"},{"categories":["Community"],"contents":"📅 Date: Wednesday, September 10, 2025\n🕒 Time: 10:45 AM – 11:30 AM BST\n📍 Location: UK D365 and Power Platform User Group, Microsoft UA92 Manchester, Room 102/103\n🎟️ Session Link: https://d365ppug-national-10092025.sessionize.com/session/991575\nSession Overview In this hands-on, demo-heavy session, Ian Tweedie (TechTweedie), Power Platform Technical Consultant at Capgemini, will showcase how to integrate automated UI testing into your low-code CI/CD workflows. Using the FREE Open Source Playwright for Power Platform DevOps Extension, you’ll learn how to:\nCreate end-to-end tests for Model-Driven Apps and Power Pages. Automate test user permissions. Generate rich HTML reports directly in Azure DevOps. Expect practical guidance, live demos, and a repeatable recipe to take home: Commit → Automatic UI Tests → Deploy.\nWhat You’ll Learn Why automated testing is essential for Power Platform projects. How to set up the Playwright for Power Platform DevOps Extension. Writing and running your first automated UI test. Viewing and interpreting test results in Azure DevOps. Bonus: How to integrate disposable test users for clean, repeatable tests. 
Demos Include Forking the starter repository. Using Playwright CodeGen to create tests. Setting up and running an Azure DevOps pipeline. Viewing test results and HTML reports. Managing disposable test users. Why Attend? 🚀 Hands-On Learning: See real-world demos and get practical tips. 🛠️ Tools You Can Use: Learn about the free Playwright extension and starter templates. 📂 Takeaways: Leave with a complete recipe for automated testing in your CI/CD pipeline. Slides Link to Slides\nGot questions Got a question or feeling a little stuck, just submit it to the Power Platform Clinic and I will be happy to answer it for you!\nResources Get Started with Playwright for Model Driven Apps by James Ryan Playwright for Power Platform DevOps Extension Starter Repository Template CodeGen Documentation Starter Repository Template Related YouTube video DevOps Pipeline name: $(TeamProject)_$(BuildDefinitionName)_$(SourceBranchName)_$(Date:yyyyMMdd)$(Rev:.r) trigger: - none pool: vmImage: windows-latest steps: - checkout: self - task: mightoria-playwrightForPowerPlatformAdvanced@1 inputs: testLocation: \u0026#39;$(System.DefaultWorkingDirectory)/tests\u0026#39; browser: \u0026#39;chromium\u0026#39; trace: \u0026#39;on\u0026#39; outputLocation: \u0026#39;$(System.DefaultWorkingDirectory)\u0026#39; appUrl: \u0026#39;https://techtweedie.crm11.dynamics.com/main.aspx?appid=6653f9fc-b74b-f011-877a-6045bd0e2fc6\u0026#39; appName: \u0026#39;MDA Playwright Testing\u0026#39; o365Username: \u0026#39;playwright-test@Tweed.technology\u0026#39; o365Password: \u0026#39;$(o365Password)\u0026#39; tenantId: \u0026#39;63759d9f-bfca-4f52-ae98-8f2f1d7bc173\u0026#39; dynamicsUrl: \u0026#39;techtweedie.crm11.dynamics.com\u0026#39; clientId: \u0026#39;6f3163d1-bd41-4f0e-8725-980f05d2a82f\u0026#39; clientSecret: \u0026#39;$(ClientSecret)\u0026#39; userRole: \u0026#39;System Administrator\u0026#39; team: \u0026#39;orgbfc42920\u0026#39; businessUnit: \u0026#39;orgbfc42920\u0026#39; - task: ArchiveFiles@2 
inputs: rootFolderOrFile: \u0026#39;$(System.DefaultWorkingDirectory)/playwright-report\u0026#39; includeRootFolder: true archiveType: \u0026#39;zip\u0026#39; archiveFile: \u0026#39;$(System.DefaultWorkingDirectory)/playwright-report/playwright-report.zip\u0026#39; replaceExistingArchive: true - publish: $(System.DefaultWorkingDirectory)/playwright-report/ artifact: playwright-report # always create the artifact, this is useful for debugging failed tests condition: always() - task: PublishTestResults@2 inputs: testResultsFormat: \u0026#39;JUnit\u0026#39; testResultsFiles: \u0026#39;**/TEST-*.xml\u0026#39; ","date":"September 10, 2025","hero":"/events/250910-d365ppugnational/large.png","permalink":"https://techtweedie.github.io/events/250910-d365ppugnational/","summary":"","tags":["Power Platform","Playwright","DevOps","Automated Testing"],"title":"Playwright for Makers – Bringing One-Click UI Tests to Model-Driven Apps \u0026 Power Pages"},{"categories":["How to"],"contents":"Instantly Document Your Power Platform Solutions with Azure DevOps Are you tired of manually documenting your Power Platform solutions? Struggling to keep your tables, relationships, option sets, and workflows up to date for your team or clients? The Power Platform Documentation Extension for Azure DevOps is designed to solve these problems—automatically. This tool helps you generate clear, accurate, and SEO-friendly Markdown documentation for your wikis and repositories, saving you hours and improving collaboration.\nWhy Documentation Matters Documentation is often overlooked in the development lifecycle. Without it, teams can struggle to understand:\nWhat tables, relationships, or option sets are in a solution. Which roles have which privileges. How workflows are configured and triggered. This extension removes that pain by automating the process. 
Run it as part of your pipeline, and instantly get up-to-date documentation every time you export your solutions.\nInstalling the Extension Head over to the Azure DevOps Marketplace. Search for Power Platform Documentation Extension. Click Get it free and choose your DevOps organisation. If you’re not an admin, you can still request installation from your administrator. Once approved, the extension will be available in your organisation. Adding Documentation to Your Pipeline To use the extension, add it as a step in your pipeline:\nExport and unpack your solution (using your existing tasks). Add a documentation step, for example: Generate Table Documentation Generate Roles Documentation Output files as single Markdown or split by table. Point the extension to your unpacked solution location and your wiki/output folder. Once the pipeline runs, you’ll see a new folder (for example /wiki/solutions/YourSolutionName) containing your Markdown files.\nExample Output After running the extension, you’ll get Markdown tables like this:\nColumn Name Type Description test1 String (no description) test2 String Example column Open these files in preview mode in Azure DevOps and you’ll see clean, well-structured documentation.\nFeatures Available The extension currently supports:\nTable Documentation Generator Entity Relationship Diagrams (ERD) Option Set Documentation Roles and Privileges Documentation Workflow Documentation (currently in preview) Solution Manifest Documentation Each feature generates Markdown output that you can store in a wiki, repository, or publish as part of your CI/CD pipelines.\nWhy Choose This Extension? Automated, always up-to-date documentation—no more manual edits. SEO-friendly Markdown for better search visibility in wikis and repos. Easy integration with Azure DevOps pipelines. Supports ERDs, roles, option sets, and more for complete solution coverage. Saves time and reduces errors for teams and solo makers alike. 
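To make the example output above concrete, here is a small, purely illustrative Python sketch of how column metadata can be rendered as a Markdown table of the same shape. This is not the extension's actual code — the function name and the column data are made up for the example:

```python
# Illustrative sketch only: renders column metadata as a Markdown table,
# similar in shape to the documentation the extension generates for you.

def to_markdown_table(columns):
    """Render a list of column dicts as a Markdown table."""
    lines = [
        "| Column Name | Type | Description |",
        "| --- | --- | --- |",
    ]
    for col in columns:
        # Fall back to a placeholder when a column has no description,
        # matching the "(no description)" convention in the example above.
        desc = col.get("description") or "(no description)"
        lines.append(f"| {col['name']} | {col['type']} | {desc} |")
    return "\n".join(lines)

# Hypothetical column data mirroring the example output in this post.
columns = [
    {"name": "test1", "type": "String", "description": None},
    {"name": "test2", "type": "String", "description": "Example column"},
]

print(to_markdown_table(columns))
```

Opening the resulting file in Azure DevOps preview mode renders it as a clean table, which is why Markdown works so well as the output format for wikis and repositories.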
Where to Get It Official product page — detailed docs \u0026amp; screenshots Visual Studio Marketplace listing — install for your Azure DevOps org Get Started — Quick Steps Install the extension from the Marketplace link above. Add a pipeline step after your solution export/unpack task that points to the unpacked solution folder. Configure output options (single Markdown file or split by table) and an output path (for example: /wiki/solutions/YourSolutionName). Run your pipeline and review the generated Markdown in the output location or in your DevOps wiki. If you want help integrating the extension into an existing pipeline, drop a comment below or reach out via the links on the product page.\n👉 Try it today and let your documentation keep pace with your development. If you found this useful, please share with your network or leave feedback below!\n","date":"September 4, 2025","hero":"/posts/250904-power-platform-documentation-extension/large.png","permalink":"https://techtweedie.github.io/posts/250904-power-platform-documentation-extension/","summary":"","tags":["Power Platform","DevOps","Documentation","Azure DevOps Extension"],"title":"Generate Power Platform Solution Documentation with Azure DevOps"},{"categories":["Power Platform Clinic"],"contents":"Auto-Tag DevOps Tickets via Email using Power Automate In this episode of the Power Platform Clinic, we explore how to keep Azure DevOps tickets updated when discussions happen over email.\nWhy It’s Handy You often need to chase suppliers or clients via email. 
This approach helps thread those email updates directly into DevOps using plus addressing and Power Automate.\nWhat You’ll Need A shared mailbox that supports plus addressing Power Automate Azure DevOps API 🗺️ Flow Overview 🔧 Key Expressions Extract Recipient Email (Uppercase \u0026amp; Split) @toUpper(triggerOutputs()?[\u0026#39;body/toRecipients\u0026#39;]) @split(outputs(\u0026#39;GET_ID0\u0026#39;),\u0026#39;;\u0026#39;) Filter to Find +Address @contains(item(),\u0026#39;DUNCAN.BOYNE\u0026#39;) Extract Work Item ID @substring( outputs(\u0026#39;EMAIL\u0026#39;), add( indexOf(outputs(\u0026#39;EMAIL\u0026#39;), \u0026#39;+\u0026#39;), 1 ), sub( indexOf(outputs(\u0026#39;EMAIL\u0026#39;), \u0026#39;@\u0026#39;), add( indexOf(outputs(\u0026#39;EMAIL\u0026#39;), \u0026#39;+\u0026#39;), 1 ) ) ) HTTP POST to Azure DevOps { \u0026#34;workItemId\u0026#34;: \u0026#34;@{outputs(\u0026#39;GET_ID\u0026#39;)}\u0026#34;, \u0026#34;text\u0026#34;: \u0026#34;@{body(\u0026#39;Html_to_text\u0026#39;)}\u0026#34; } ✅ Result This setup lets you:\nEmail using plus addressing like support+123@domain.com Extract the DevOps ticket number Convert email body to plain text Post the content as a comment to the DevOps item Wrap-Up It’s a simple, powerful way to ensure nothing gets missed—especially when your team lives in Outlook more than DevOps.\nWant the full expressions and examples? Stay tuned for the downloadable version, or get in touch.\nDownload the Solution \u0026amp; Flow You can download the ready-to-import Power Platform solution and the Leglsey Power Automate flow from the public GitHub repository:\nhttps://mars.mightora.io/yourls/250720github\nThe /solutions/ folder contains the solution ZIP files (managed and unmanaged). The /flows/ folder contains the Leglsey Power Automate flow ZIP file. See the repository README for import instructions. 🎥 Watch the video: https://mars.mightora.io/yourls/250720yt\n❓ Got a question? 
Submit it here: https://powerplatformclinic.github.io\n","date":"July 20, 2025","hero":"/podcasts/power-platform-clinic/250720-power-platform-clinic-episode-seven/lg.jpg","permalink":"https://techtweedie.github.io/podcasts/power-platform-clinic/250720-power-platform-clinic-episode-seven/","summary":"","tags":["Power Platform","DevOps","Power Automate"],"title":"Power Platform Clinic Episode 7: Automatically Tag DevOps Tickets by Email"},{"categories":["How to"],"contents":"🧪 Playwright + Power Platform + DevOps Pipelines? Yes please! Just dropped a new video where I walk through the latest update to my Playwright for Power Platform DevOps Extension. This time, we’re not just running tests — we’re dynamically assigning roles, business units, and teams to a test user before your Playwright test runs, and then removing them right after. Clean, controlled, reusable testing every time ✅\n🎯 What\u0026rsquo;s new in this release? Assign security roles dynamically in a DevOps pipeline Switch business units and teams for your test user Clean up roles after the test completes Fully open-source DevOps task ready to install Compatible with any Playwright tests against Power Platform This means fewer test users, more test coverage, and pipelines you can actually trust.\n📺 Watch the video now: https://youtu.be/lFHQ8HUsnMI\n🌐 Get the extension from Mightora.io: https://mightora.io\n💬 Let me know how you\u0026rsquo;re testing Power Platform apps!\n🔧 Sample DevOps Pipeline Task - task: mightoria-playwrightForPowerPlatformAdvanced@1 inputs: testLocation: \u0026#39;$(System.DefaultWorkingDirectory)/PlaywrightTests\u0026#39; browser: \u0026#39;chromium\u0026#39; trace: \u0026#39;on\u0026#39; outputLocation: \u0026#39;$(System.DefaultWorkingDirectory)\u0026#39; appUrl: \u0026#39;https://techtweedie.crm11.dynamics.com/main.aspx?appid=6653f9fc-b74b-f011-877a-6045bd0e2fc6\u0026#39; appName: \u0026#39;MDA Playwright Testing\u0026#39; o365Username: 
\u0026#39;playwright-test@Tweed.technology\u0026#39; o365Password: \u0026#39;$(o365Password)\u0026#39; tenantId: \u0026#39;63759d9f-bfca-4f52-ae98-8f2f1d7bc173\u0026#39; dynamicsUrl: \u0026#39;techtweedie.crm11.dynamics.com\u0026#39; clientId: \u0026#39;db808651-052b-4fc1-83da-ac5149066043\u0026#39; clientSecret: \u0026#39;$(ClientSecret)\u0026#39; userRole: \u0026#39;System Administrator\u0026#39; team: \u0026#39;orgbfc42920\u0026#39; businessUnit: \u0026#39;orgbfc42920\u0026#39; #PowerPlatform #DevOps #Playwright #Testing #Automation #Mightora #ModelDrivenApps #AzureDevOps\n","date":"July 15, 2025","hero":"/posts/250715-running-advanced-playwright-tests-in-azure-devops-for-power-platform/thumbnail.png","permalink":"https://techtweedie.github.io/posts/250715-running-advanced-playwright-tests-in-azure-devops-for-power-platform/","summary":"","tags":["Power Platform","DevOps","Playwright","Testing"],"title":"Running ADVANCED Playwright Tests in Azure DevOps for Power Platform Apps"},{"categories":["Power Platform Clinic"],"contents":"Power Platform Clinic Episode 6 – Certs, Consultancy, and Career Conversations In this episode of the Power Platform Clinic, Duncan and I get stuck into some big-picture questions from the community—about Microsoft certifications, life as a consultant, and how to grow your career across the Power Platform and beyond.\n🎓 Are Microsoft Certifications Worth It? Spoiler: Yes, but with caveats.\nI share how I started collecting Microsoft certs back when they were being offered free (shoutout to the days of the 900 series!). But it’s not about collecting badges—certs give you structure, breadth, and that theoretical grounding that helps when you’re diving into something new.\n\u0026ldquo;Certifications are useful, but experience trumps all. 
Ideally, you want both.\u0026rdquo;\nWe talk through the differences between the PL-200 and PL-400, and why broader learning (outside of your usual lane) is such a smart investment—especially when it helps you understand your colleagues’ roles.\n💼 Day in the Life of a Consultant Consultancy looks different depending on where you sit.\nI share what it’s like working at a global consultancy like Capgemini, from running projects with deep Azure integration and Power Platform DevOps, to mentoring devs and liaising with senior stakeholders. We chat about coaching vs mentoring, wearing different hats, and being the go-to expert in your domain.\nWe also break down the three ways we all get paid:\n💰 Salary 💚 Satisfaction 📈 Experience And why aiming for two out of three is the sweet spot.\n🧠 More Than Power Platform: Think Strategically This isn’t just about learning Flow expressions and Dataverse tables. It’s also about building skills that elevate your thinking—like Agile, PRINCE2, TOGAF, and Six Sigma.\n\u0026ldquo;Every Agile project lives inside a bigger waterfall project somewhere.\u0026rdquo;\nWe talk about architectural thinking, knowing how to speak the language of the business, and designing with failure (and damage limitation) in mind.\n🎥 Watch the Episode You can watch the full episode on YouTube here: Power Platform Clinic Episode 6\n🧩 Wrapping Up Whether you’re new to the Power Platform, thinking about your next career step, or trying to level up in consultancy, this episode has something for you.\n➡️ Got a question for the next episode? 
Drop it in the comments on YouTube or tag us on LinkedIn.\n💬 And if you liked this, don’t forget to subscribe and leave a comment—even just to say “Nice shirt” 😆\n📚 Related Content Power Platform Clinic Episode 5 Securing Power Automate for Production How to Set Folder Permissions in SharePoint ","date":"July 13, 2025","hero":"/podcasts/power-platform-clinic/250713-power-platform-clinic-episode-six/large.png","permalink":"https://techtweedie.github.io/podcasts/power-platform-clinic/250713-power-platform-clinic-episode-six/","summary":"","tags":["Certifications","Power Platform","Consultancy"],"title":"Power Platform Clinic - Episode 6: Certs, Consultancy, and Career Conversations"},{"categories":["How to"],"contents":"Introduction Are you looking to easily run Playwright tests for your Power Platform apps within an Azure DevOps pipeline? In this tutorial, I show you how to use my Playwright for Power Platform DevOps extension to run tests against your model-driven app, capture reports, and publish results.\nBelow is the embedded video:\nWhat you will learn How to install the Playwright for Power Platform DevOps extension How to write a Playwright test that creates a Contact record using fake data How to set up your Azure DevOps pipeline to run these tests, archive reports, and publish results Code Snippet – Playwright Test Here is the test script used in the video to add a contact with fake data:\nimport { test, expect } from \u0026#39;@playwright/test\u0026#39;; import \u0026#39;dotenv/config\u0026#39;; interface Config { appUrl: string; appName: string; username: string; password: string; tenantId: string; } const config: Config = { appUrl: process.env.APP_URL || \u0026#39;default_url\u0026#39;, appName: process.env.APP_NAME || \u0026#39;default_name\u0026#39;, username: process.env.O365_USERNAME || \u0026#39;default_username\u0026#39;, password: process.env.O365_PASSWORD || \u0026#39;default_password\u0026#39;, tenantId: process.env.O365_TENANT_ID || 
\u0026#39;default_tenant_id\u0026#39;, }; let fakeData: { firstName: string; lastName: string; email: string }; test(\u0026#39;add-contact\u0026#39;, async ({ page }) =\u0026gt; { const fetch = (await import(\u0026#39;node-fetch\u0026#39;)).default; const response = await fetch(\u0026#39;https://fakerapi.it/api/v2/custom?_quantity=1\u0026amp;FirstName=firstName\u0026amp;LastName=lastName\u0026amp;Email=email\u0026#39;); const result = await response.json() as { data: Array\u0026lt;{ FirstName: string; LastName: string; Email: string }\u0026gt; }; const user = result.data[0]; fakeData = { firstName: user.FirstName, lastName: user.LastName, email: user.Email, }; await page.goto(config.appUrl); const appTitle = page.locator(`text=\u0026#34;${config.appName}\u0026#34;`).first(); await expect(appTitle).toBeVisible({ timeout: 10000 }); await page.getByText(\u0026#39;AddSpecificResource_16Contacts\u0026#39;).click(); await page.getByRole(\u0026#39;menuitem\u0026#39;, { name: \u0026#39;New\u0026#39;, exact: true }).click(); await page.getByRole(\u0026#39;button\u0026#39;, { name: \u0026#39;dismiss\u0026#39; }).click(); await page.getByRole(\u0026#39;textbox\u0026#39;, { name: \u0026#39;First Name\u0026#39; }).fill(fakeData.firstName); await page.getByRole(\u0026#39;textbox\u0026#39;, { name: \u0026#39;Last Name\u0026#39; }).fill(fakeData.lastName); await page.getByRole(\u0026#39;textbox\u0026#39;, { name: \u0026#39;Email\u0026#39;, exact: true }).fill(fakeData.email); await page.getByRole(\u0026#39;menuitem\u0026#39;, { name: \u0026#39;Save \u0026amp; Close\u0026#39; }).click(); }); Pipeline YAML Here is the Azure DevOps pipeline YAML shown in the video to run your tests, archive reports, publish test results, and commit changes to your repo:\nname: $(TeamProject)_$(BuildDefinitionName)_$(SourceBranchName)_$(Date:yyyyMMdd)$(Rev:.r) trigger: - none pool: vmImage: windows-latest steps: - checkout: self persistCredentials: true clean: true - task: 
mightoria-playwrightForPowerPlatform@1 inputs: testLocation: \u0026#39;$(System.DefaultWorkingDirectory)/PlaywrightTests\u0026#39; browser: \u0026#39;chromium\u0026#39; trace: \u0026#39;on\u0026#39; outputLocation: \u0026#39;$(System.DefaultWorkingDirectory)\u0026#39; appUrl: \u0026#39;https://techtweedie.crm11.dynamics.com/main.aspx?appid=6653f9fc-b74b-f011-877a-6045bd0e2fc6\u0026#39; appName: \u0026#39;MDA Playwright Testing\u0026#39; o365Username: \u0026#39;playwright-test@Tweed.technology\u0026#39; o365Password: \u0026#39;$(o365Password)\u0026#39; - task: ArchiveFiles@2 inputs: rootFolderOrFile: \u0026#39;$(System.DefaultWorkingDirectory)/playwright-report\u0026#39; includeRootFolder: true archiveType: \u0026#39;zip\u0026#39; archiveFile: \u0026#39;$(System.DefaultWorkingDirectory)/playwright-report/playwright-report.zip\u0026#39; replaceExistingArchive: true - publish: $(System.DefaultWorkingDirectory)/playwright-report/ artifact: playwright-report condition: always() - task: PublishTestResults@2 inputs: testResultsFormat: \u0026#39;JUnit\u0026#39; testResultsFiles: \u0026#39;**/TEST-*.xml\u0026#39; - task: commitToRepo@2 inputs: commitMsg: \u0026#39;$(Date:yyyyMMdd)$(Rev:.r)\u0026#39; branchName: \u0026#39;main\u0026#39; Why use this approach? ✅ Simple integration – install the free DevOps extension\n✅ Automated testing – validate your app’s functionality on every build\n✅ Reports and trace analysis – diagnose flaky tests or environment issues\n✅ Secure credentials – use pipeline variables for secrets\nConclusion Running Playwright tests against your Power Platform apps in Azure DevOps is now easier than ever with the free extension I’ve released. 
Start incorporating tests into your pipelines to catch issues early, improve app quality, and increase confidence in every deployment.\nIf you have any questions or want to see how this works with other Power Platform components such as Canvas Apps or Power Pages, leave a comment on the video – I’d be happy to cover those in future posts.\nHappy testing!\n","date":"July 2, 2025","hero":"/posts/250702-running-playwright-tests-in-azure-devops-for-power-platform/thumbnail1.png","permalink":"https://techtweedie.github.io/posts/250702-running-playwright-tests-in-azure-devops-for-power-platform/","summary":"","tags":["Power Platform","DevOps","Playwright","Testing"],"title":"Running Playwright Tests in Azure DevOps for Power Platform Apps"},{"categories":["Power Platform Clinic"],"contents":"Introduction In this episode of the Power Platform Clinic, Ian (that’s me!) tackles a real-world community question:\nIs it possible to convert a Microsoft Form that contains both text and images into a PDF using Power Automate?\nAs usual, we go deep on the technical side—pulling data, parsing responses, handling images, encoding files as base64, generating dynamic HTML, and finally converting it all to a PDF and emailing it out.\nThis is a proof-of-concept walkthrough using standard Power Automate connectors—no premium, no external APIs, and no code hosting needed.\nWhat We Covered 🧾 The Scenario A Microsoft Form allows users to enter text and upload an image. We want to capture the form data and generate a PDF. The image should be embedded directly into the PDF, not linked separately. 🛠️ The Flow Design Trigger: When a form response is submitted. Scope Try-A: Get response details and parse the JSON. HTML Compose: Generate a valid HTML structure using Compose actions, injecting form responses and base64-encoded images. Conversion: Save the HTML to OneDrive, then use the built-in “Convert to PDF” action. Email: Send the final PDF as an attachment. 
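The HTML Compose step above injects base64-encoded images straight into the markup. Outside Power Automate, that data-URI technique can be sketched in a few lines of Python — this is only an illustration of the encoding step, not part of the flow itself, and the byte string stands in for the real file content retrieved from OneDrive:

```python
import base64

# Stand-in for the image bytes the flow reads from OneDrive.
image_bytes = b"\x89PNG\r\n\x1a\n fake image payload"
# In the flow, the content type comes from the retrieved file's metadata.
content_type = "image/png"

# Encode the bytes and embed them as a data URI, so the later
# HTML-to-PDF conversion needs no external image links.
b64 = base64.b64encode(image_bytes).decode("ascii")
html = (
    "<h2>Questionnaire</h2>"
    f'<img src="data:{content_type};base64,{b64}" />'
)

# Round-trip check: decoding the data URI recovers the original bytes.
payload = html.split("base64,")[1].split('"')[0]
assert base64.b64decode(payload) == image_bytes
```

Because the image travels inside the HTML itself, the generated PDF is fully self-contained — exactly the property the flow relies on.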
🖼️ Image Handling This was the tricky part:\nImages uploaded in Microsoft Forms are stored in the responder\u0026rsquo;s OneDrive. We retrieved the file path and then used decodeUriComponent() to convert it into a valid format. From there, we used “Get File Content Using Path” to get the base64 body. That base64 string was embedded directly into an \u0026lt;img\u0026gt; tag inside the HTML. 📄 Building the HTML We built the structure dynamically:\n\u0026lt;h2\u0026gt;Questionnaire\u0026lt;/h2\u0026gt; \u0026lt;table\u0026gt; \u0026lt;tr\u0026gt; \u0026lt;th\u0026gt;Question\u0026lt;/th\u0026gt; \u0026lt;th\u0026gt;Response\u0026lt;/th\u0026gt; \u0026lt;/tr\u0026gt; \u0026lt;tr\u0026gt; \u0026lt;td\u0026gt;Here is question one\u0026lt;/td\u0026gt; \u0026lt;td\u0026gt;[Dynamic Text Response]\u0026lt;/td\u0026gt; \u0026lt;/tr\u0026gt; \u0026lt;tr\u0026gt; \u0026lt;td\u0026gt;Image\u0026lt;/td\u0026gt; \u0026lt;td\u0026gt;\u0026lt;img src=\u0026#34;data:image/png;base64,[Dynamic Base64]\u0026#34; /\u0026gt;\u0026lt;/td\u0026gt; \u0026lt;/tr\u0026gt; \u0026lt;/table\u0026gt; We also made this dynamic by setting the image contentType dynamically using expressions like:\nbody(\u0026#39;Get_file_content_using_path\u0026#39;)?[\u0026#39;$content-type\u0026#39;] This means your flow works whether someone uploads a PNG, JPEG, or GIF.\n✉️ Sending the PDF We wrapped up by converting the HTML to a PDF using OneDrive’s native “Convert File” action and emailing the result with Send Email (V2).\nThe attachment file name was dynamically set using the utcNow() expression, like:\nutcNow() \u0026amp; \u0026#39;.pdf\u0026#39; 🧪 Proof of Concept Success We tested the end-to-end flow live:\nMicrosoft Form submitted with text and image ✅ PDF generated with embedded base64 image ✅ Email sent with correct attachment ✅ Flow structured using scopes for clear error handling (Try A, Try B, Try C) ✅ Key Learnings 🧠 Working with Microsoft Forms and file uploads can be fiddly—but Power Automate 
handles it with some clever expressions. 🛠️ Always structure flows using Scopes. It helps make error handling easier and documentation cleaner. 📩 Embedding images as base64 in HTML is a great trick when sending visual PDFs. Coming Next Stay tuned for the next Power Platform Clinic! As always, if you’ve got questions, submit them at:\n🔗 https://powerplatformclinic.github.io\nOr comment on the video—we’re always on the lookout for the next fun challenge!\nGot a tricky Power Platform scenario? Comment, subscribe, and we might tackle your problem in the next episode!\n","date":"June 25, 2025","hero":"/podcasts/power-platform-clinic/250625-power-platform-clinic-episode-five/large.png","permalink":"https://techtweedie.github.io/podcasts/power-platform-clinic/250625-power-platform-clinic-episode-five/","summary":"","tags":["Power Automate","Forms","HTML","PDF","Power Platform Clinic"],"title":"Power Platform Clinic – Episode 5: Converting Forms with Images to PDF"},{"categories":["How to"],"contents":"Introduction Are you using Entra External ID with Power Pages and want to customise the login experience? 
This guide walks through exactly that — applying your own branding, images, colours, and layout to the Entra login screen for external users.\nThis is especially useful for creating a polished, consistent experience when your users are accessing Power Pages from outside your organisation.\nIf you’re new to setting up Entra External ID for Power Pages, I’ve also got a full video walkthrough on how to configure that — I’ll link it below.\nWhat You\u0026rsquo;ll Learn In this video and post, we explore:\nHow to verify you\u0026rsquo;re in the correct Entra tenant for external ID customisation Where to find Company Branding settings in Microsoft Entra How to add: Favicon Background image Custom page colours Header and footer logos Links for privacy and terms Common pitfalls (like image size limits) How to test your branding changes in a live Power Pages scenario Why This Matters The default login experience in Entra External ID works — but let’s face it — it’s pretty bland. For public-facing websites or citizen-facing portals, you want something that reinforces trust and shows attention to detail. Custom branding allows you to:\nMatch your corporate style Reduce user confusion Deliver a consistent login experience from start to finish Tips to Remember Images like background and logo must be under 300KB. You’ll need the Organizational Branding Administrator role (or higher). Don’t forget to test in the correct tenant — you’ll usually have a separate External tenant in Entra for this. Email templates are not controlled here — we’ll look at that in another post. Final Thoughts This small step makes a huge difference in how professional your solution feels to users. Whether it’s a public sector form, citizen portal, or partner landing page, getting your login screen branded properly is worth the effort.\nLet me know if you have any questions or want a hand setting up your Entra branding!\n🔗 Watch on YouTube: Walk through Power Pages with Entra External ID\n🔧 Need help? 
Contact me via iantweedie.biz\nOriginally posted on TechTweedie, going beyond the low code, my home for all things Power Platform and low-code DevOps.\n","date":"June 24, 2025","hero":"/posts/250624-power-pages-customise-entra-external-id/thumbnail-1.png","permalink":"https://techtweedie.github.io/posts/250624-power-pages-customise-entra-external-id/","summary":"","tags":["Power Pages","EntraID","External ID","Branding","Authentication","Power Platform"],"title":"Walkthrough Customising Entra External ID for Power Pages"},{"categories":["Power Platform Clinic"],"contents":"Introduction Episode 4 of the Power Platform Clinic with Duncan Boyne and Ian Tweedie.\nIn this episode, Duncan and Ian dive deep into:\n🔐 Power Pages Authentication Options Ian breaks down:\nThe difference between public and private modes in Power Pages. How Entra ID, External Entra ID, and Azure B2C work for authenticating users. Why using Azure B2C or External Entra ID is often preferred, including centralised identity management, SIEM logging, and reducing password management headaches. How these options relate to OAuth and OIDC protocols for enabling social logins like LinkedIn or Google. Key takeaway: Understanding authentication options is crucial for designing secure Power Pages implementations that fit your organisation\u0026rsquo;s user landscape, whether purely internal, external, or hybrid.\n👨‍💻 Setting Up Your Own Developer Account Duncan and Ian guide you through:\nUsing the Microsoft 365 Developer Program to get a free developer tenant (noting tightened eligibility in recent years). Alternative approaches including: Using Visual Studio benefits if you have them. Starting with an Office 365 trial to activate Power Platform developer environments. Keeping your tenant active by buying a minimal license (e.g. Exchange Online Plan 1) to ensure continuity. Creating your Power Platform developer environment for experimenting with solutions, apps, and portals. 
They demonstrate step-by-step how to set up a trial tenant, activate the developer license, and spin up your first environment for Power Platform learning and testing.\n💡 Why This Matters Whether you are:\nBuilding your first Power Pages portal. Testing authentication patterns for a client project. Exploring Power Platform features without risking production environments. … having your own developer environment and understanding authentication is essential for safe, efficient experimentation.\n🔗 Related Resources Microsoft 365 Developer Program Power Pages documentation Power Platform Clinic YouTube Channel 🙌 Connect with us If you have questions you’d like us to cover in a future clinic, submit them here.\nThanks for watching, and don’t forget to like, share, and subscribe to support the Power Platform community!\nThis blog post complements Power Platform Clinic Episode 4 and summarises the key insights for easy reference.\n","date":"June 3, 2025","hero":"/podcasts/power-platform-clinic/250603-power-platform-clinic-episode-four/large.png","permalink":"https://techtweedie.github.io/podcasts/power-platform-clinic/250603-power-platform-clinic-episode-four/","summary":"","tags":["Power Platform Clinic","Power Pages","Authentication","Developer Account"],"title":"Power Platform Clinic – Episode 4: Power Pages Authentication Options and Developer Accounts"},{"categories":null,"contents":"D365 \u0026amp; Power Platform UG North East – June 2025 Meetup 📅 Date: Thursday, June 26, 2025\n🕕 Time: 6:00 PM – 8:10 PM BST\n📍 Location: Haylofts, 5 Saint Thomas\u0026rsquo; Street, Newcastle upon Tyne\n🎟️ Register Now: https://mars.mightora.io/yourls/d365ppugne2506\nCo-Hosted By This event is proudly co-hosted by:\nAgata Guziur Justin Wilkinson Ian Tweedie Together, we’re bringing the North East community a fantastic evening of learning, sharing, and connection.\nConnect. Learn. Share. 
We’re excited to invite you to the June 2025 meetup of the D365 \u0026amp; Power Platform User Group North East — an evening of community learning, insight sharing, and connection with local professionals who are passionate about the Microsoft stack.\nWhether you\u0026rsquo;re a seasoned solution architect, a curious maker, or simply Power Platform-curious, you’ll find valuable sessions and great conversation.\nAgenda 18:15 – Welcome, Networking, and Introductions Come early, grab a drink, meet your peers, and get settled.\n18:30 – Grounding AI: Lessons from the Human Brain Speaker: Eswar Prakash\n📌 Session Details \u0026amp; Link: Session Details\nEswar explores the concept of grounding in AI systems — drawing inspiration from human cognition and neuroscience. You’ll gain a unique perspective on how our brains build internal models, and how those insights can help us design more robust, trustworthy AI.\nExpect brain science, future-gazing, and practical ideas you can apply in the world of intelligent automation.\n19:15 – Break, Food \u0026amp; Networking Recharge with refreshments and continue the conversation with fellow attendees.\n19:30 – An (Up-to-Date) Guide on Building Modern Canvas App Components Speaker: Josh Giles\n📌 Session Details \u0026amp; Link: Session Details\nJosh walks through the fundamentals of building reusable Canvas App components, offers design tips for maintainability, and delivers a hands-on demo to bring it all to life.\nPerfect for makers and developers looking to upskill and streamline app creation.\nWhy Attend? 
💡 Fresh insights from real-world experts 👥 Connect with the local D365 and Power Platform community 🧠 Learn something new — whether technical, strategic, or architectural 🍕 Free food and drinks — and great company 👉 Reserve Your Spot 🎟️ Click here to register for the event\nSpaces are limited — grab your seat early!\nFor post-event updates, reflections, and session write-ups — check back here or follow along on LinkedIn.\n","date":"June 1, 2025","hero":"/events/250626-d365ppugne/large.png","permalink":"https://techtweedie.github.io/events/250626-d365ppugne/","summary":"","tags":["Power Platform","Power Apps","Power Automate","AI"],"title":"D365 \u0026 Power Platform UG North East – June 2025 Meetup"},{"categories":["Power Platform Clinic"],"contents":"Introduction Welcome back to the Power Platform Clinic, your weekly check-in for diagnosing everyday Power Platform challenges — and fixing them without the consulting invoice.\nWhether you\u0026rsquo;re a maker, a consultant, or someone who just typed \u0026ldquo;Power Automate funny API\u0026rdquo; into Google — you’re in the right place.\nIn this week’s episode (Episode 3!), co-hosted by myself (Tech Tweedie) and Duncan (Power BI Kinda Guy), we tackled two questions you’ve either already asked — or definitely will soon.\n🎥 Watch the Episode Catch the full walkthrough, demos, and our usual healthy dose of banter in the video:\n🤖 Calling a Dad Joke API from Power Automate Ever needed data from an API only to find there\u0026rsquo;s no Power Automate connector?\nIn my segment, I showed how to call the icanhazdadjoke.com REST API — a lightweight, no-auth service for generating random dad jokes.\nWhat I Covered: Testing the endpoint in Postman to check the response. Setting a simple Accept: application/json header. Using HTTP actions in Power Automate to call the API. 
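Outside Power Automate, the same dad-joke call is a one-header GET. A minimal sketch, split so the parsing part runs offline against a canned body (the response shape — id, joke, status — follows the API's documented JSON output; the sample joke below is made up):

```javascript
// Build the request the Power Automate HTTP action mirrors:
// a plain GET with an Accept header — no auth, no custom connector.
function buildJokeRequest() {
  return {
    url: "https://icanhazdadjoke.com/",
    method: "GET",
    headers: { Accept: "application/json" },
  };
}

// Pull the joke text out of the JSON body (shape per the API docs: id, joke, status).
function extractJoke(body) {
  const parsed = typeof body === "string" ? JSON.parse(body) : body;
  return parsed.joke;
}

// Canned response so the sketch runs offline; id value is a placeholder.
const sample = '{"id":"abc123","joke":"Why did the scarecrow win an award? He was outstanding in his field.","status":200}';
console.log(extractJoke(sample));
```

In the flow, the equivalent of `extractJoke` is simply a Parse JSON action (or a direct reference to the HTTP action's body).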
No authentication, no custom connectors, no nonsense — just a clean, quick integration to liven up your flow.\n📊 Custom Visuals in Power BI with Duncan Been tempted by flashy Power BI visuals but wondered if they’re actually safe?\nDuncan walked through how to bring Power BI custom visuals into your reports safely and effectively.\nWhat Duncan Covered: Where to find custom visuals. Understanding publisher trust and Microsoft certification. Connecting visuals to your data model and formatting options. Custom visuals can level up your dashboards — as long as you know where they came from.\n💬 Got More Power Platform Puzzles? Drop your questions in the comments or use the form on https://powerplatformclinic.github.io/ — we take requests.\nBecause making your job easier shouldn’t involve sacrificing peace of mind, right?\n🩺 Power Platform Clinic — real issues, solved without drama.\n","date":"May 26, 2025","hero":"/podcasts/power-platform-clinic/250526-power-platform-clinic-episode-three/large.png","permalink":"https://techtweedie.github.io/podcasts/power-platform-clinic/250526-power-platform-clinic-episode-three/","summary":"","tags":["Power Automate","Power BI","Custom Connector","REST API","Custom Vision","Power Platform"],"title":"Power Platform Clinic – Episode 3: Calling a Dad Joke API and Custom Visuals in Power BI Clinic"},{"categories":["How to"],"contents":"Introduction Need to enable Entra External ID sign-in for your Power Pages site? 
Whether you\u0026rsquo;re looking to support customer access or enable collaboration with external users, the combination of Power Pages and Entra ID is a powerful way to ensure secure, identity-driven access to your platform.\nIn this walkthrough video, I demonstrate how to integrate Entra External ID with a Power Pages site, covering:\nSetting up Entra External ID Configuring identity providers Connecting the authentication flow with your Power Pages site Testing the login experience from an external user’s perspective Why Use Entra External ID with Power Pages? Power Pages is ideal for creating public- or partner-facing web portals, but managing access securely is always a challenge. Entra External ID lets you:\nAllow users to sign in with their own credentials (e.g., Microsoft, Google, etc.) Manage users in a scalable and secure way Reduce the need for custom authentication solutions Combining the two ensures a secure, scalable solution for engaging with users outside your organisation.\nWhat You\u0026rsquo;ll Learn in the Video In the video, I take you through the following steps:\nRegistering your Power Pages app with Entra External ID Connecting your Power Pages authentication settings to Entra Testing login flow as an external user By the end of the walkthrough, you’ll have a working Power Pages site that accepts sign-in via Entra ID for external identities—ready for use in customer or partner scenarios.\nFinal Thoughts Integrating Entra External ID with Power Pages is a vital step if you’re building a secure, user-friendly portal for external users. 
Whether you’re building customer-facing systems, partner portals, or community sites, this walkthrough gives you the building blocks you need.\nAs always, if you have questions or want to explore more advanced scenarios—like custom branding or consent policies—feel free to reach out.\nOriginally posted on TechTweedie, going beyond the low code, my home for all things Power Platform and low-code DevOps.\n","date":"May 25, 2025","hero":"/posts/250524-power-pages-with-entra-external-id/large.png","permalink":"https://techtweedie.github.io/posts/250524-power-pages-with-entra-external-id/","summary":"","tags":["Power Automate","EntraID","Power Platform"],"title":"Walk through Power Pages with Entra External ID"},{"categories":null,"contents":"📅 Date: 22nd May 2025\n📍 Location: Norwich Digital Hub\n🍕 Pizza included!\nSession Overview At the Norfolk Power Platform User Group, I delivered a session titled:\n✨ “Enabling low-code developers to create custom client reports with a helping hand from Azure”\nIn this session, I explored:\nUsing Power Automate to generate custom reports securely. How to leverage an Azure Function as a reverse proxy to enhance security and scalability. An architectural pattern that integrates Dataverse, Power Automate, SharePoint, and Azure Functions to deliver dynamic, client-ready PDFs via secure links. Presentation Slides You can view my full presentation here:\n➡️ View the Presentation Slides\nGitHub Repository The demo and solution shared in this session are available on GitHub:\n➡️ AzureFunction-PowerAutomateProxy Repository\nThis repository includes:\nAn Azure Function HTTP Proxy that authenticates requests and forwards them to your Power Automate Flow. Full support for GET, POST, and OPTIONS methods. Configurable environment variables to avoid hardcoded URLs or keys. Deploy-to-Azure templates for quick deployment into your own environment. 
Key Features of the Solution ✅ Forwards all headers from the incoming request.\n✅ Adds a custom Flow-Key header (from environment variables).\n✅ Appends query parameters to the external URL dynamically.\n✅ Provides a robust pattern for securely exposing Power Automate flows as web endpoints.\nTry It Out If you’re interested in setting up this pattern for your own client reporting requirements, follow these steps:\nAccess the GitHub repo. Deploy using the Deploy to Azure button provided in the README. Import the Power Platform solutions to your environment. Configure your environment variables with your Logic App or Flow URL and secret key. Test your new secure reverse proxy-enabled flow! Why This Matters Many organisations need to provide custom PDF or HTML reports to clients securely without exposing their backend systems. Combining Power Automate for report generation with Azure Functions for authentication and reverse proxying provides a scalable, secure, and low-code approach to achieve this.\nEvent Registration Link Here is the original event registration link for reference:\n➡️ Event Registration – Norfolk Power Platform User Group\n💡 Want to know more?\nReach out if you’re interested in deploying this pattern in your organisation or have questions about Azure Functions and Power Automate integration.\nThank you to everyone who attended the session. 
It was great to see so much interest in combining low-code solutions with pro-code architecture to deliver real business value.\n","date":"May 22, 2025","hero":"/events/250522-nppug/image.png","permalink":"https://techtweedie.github.io/events/250522-nppug/","summary":"","tags":["Power Platform","Power Automate","Azure Functions","Community Event"],"title":"At the Norfolk Power Platform User Group: Enabling Low-Code Developers to Create Custom Client Reports with a Helping Hand from Azure – Norfolk Power Platform User Group"},{"categories":["Power Platform Clinic"],"contents":"Introduction Welcome to Episode 2 of the Power Platform Clinic with Duncan and Ian!\nIn this episode:\n✅ Validating Email Domains with Power Automate Ian shares a powerful, free way to validate email domains using Power Automate, showing:\nWhy validating the entire email address is tricky without sending a link How to split out the domain using the split function Using the CheckDomain connector to confirm whether a domain exists and can receive emails How this approach avoids the costs of third-party services and can be used with data from anywhere: Dataverse, Excel, or SharePoint Ian walks through building the flow step by step, demonstrating how it splits the email, checks the domain, and writes the result back to a field in Dataverse to clean up data input.\n✅ Collating Reports into a Power BI App Duncan then dives into Power BI to show:\nHow to create an app to group reports together for easy access Using the app to share reports with the whole organisation or specific groups Tips for embedding reports in Teams for easy mobile access by managers and leadership Key licensing considerations when deploying Power BI apps across your organisation 💡 Key Takeaways Email validation using Power Automate and the CheckDomain connector can significantly improve your data quality without paid tools. Power BI apps make accessing reports easier for end users and simplify sharing across Teams and devices. 
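To make the email-validation takeaway concrete: the domain split is a one-liner. In Power Automate it is an expression along the lines of split(<email>, '@')[1]; the equivalent logic in JavaScript terms, with a light sanity check before handing the domain to something like the CheckDomain connector, might look like:

```javascript
// Extract the domain portion of an email address — the part the
// domain check is asked to verify. Mirrors a Power Automate
// expression along the lines of split(<email>, '@')[1].
function emailDomain(email) {
  const parts = email.trim().toLowerCase().split("@");
  // A usable address has exactly one "@" with text on both sides.
  if (parts.length !== 2 || parts[0] === "" || parts[1] === "") {
    return null; // not a usable address — skip the domain check
  }
  return parts[1];
}

console.log(emailDomain("Jane.Doe@Example.com")); // "example.com"
console.log(emailDomain("not-an-email"));         // null
```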
We hope you find these insights useful for your next project!\nStay tuned for the next episode, where we continue to tackle real-world Power Platform questions from the community.\n👉 Have a question for us? Drop it in the comments below or reach out via Power Platform Clinic GitHub.\nFor more detailed walkthroughs, demos, and free tools, check out other posts on the blog and subscribe to our YouTube channel.\n","date":"May 20, 2025","hero":"/podcasts/power-platform-clinic/250520-power-platform-clinic-episode-two/large.png","permalink":"https://techtweedie.github.io/podcasts/power-platform-clinic/250520-power-platform-clinic-episode-two/","summary":"","tags":["Power Platform Clinic","Power Automate","Power BI","Email Validation"],"title":"Power Platform Clinic – Episode 2"},{"categories":["Power Platform"],"contents":"Introduction Ever wanted to show or hide a tab in a Model-Driven App based on the Status Reason of a record? Maybe you’ve got a section of a form that’s only relevant when a record hits a specific state — like “Approved” or “Rejected”?\nThis post walks you through a reusable JavaScript snippet I’ve shared on GitHub that dynamically toggles tab visibility based on the statuscode field.\nWhy This Is Useful Sometimes users don’t need to see everything all the time. Maybe you’ve got a whole section for post-approval actions — or maybe you want to hide advanced options until a record is in a specific state.\nRather than relying on Business Rules (which can’t target tabs), or overcomplicating things with multiple forms, a bit of client-side JavaScript is just the trick.\nWhat It Does This script listens for changes to the Status Reason field and shows/hides a tab (tab3 in this case) depending on the value:\nIf Status Reason = 1 → hide the tab If Status Reason = 304880001 → show the tab For all other statuses → hide the tab by default You can easily customise the status values and tab name to suit your scenario.\nHow to Use It 1. 
Download or copy the snippet You can grab the full snippet from GitHub here:
👉 View on GitHub
Or copy it below:

```javascript
"use strict";

function toggleByStatusReason(executionContext) {
    console.log("==== START show/hide Status Reason ===="); // Start log

    // Get the form context from the execution context
    const formContext = executionContext.getFormContext();
    console.log("Retrieved form context.");

    // Get the Status Reason value (statuscode column)
    const status = formContext.getAttribute("statuscode").getValue(); // Status Reason
    console.log(`Status Reason value retrieved: ${status}`);

    // Get the tab to show/hide (tab3)
    const tab = formContext.ui.tabs.get("tab3");
    if (tab) {
        console.log("Tab 'tab3' found.");
        // Decision-making based on the Status Reason value
        if (status === 1) {
            console.log("Status Reason is 'Status One' (value: 1). Hiding tab3.");
            tab.setVisible(false); // Hide tab3
        } else if (status === 304880001) {
            console.log("Status Reason is 'Status Two' (value: 304880001). Showing tab3.");
            tab.setVisible(true); // Show tab3
        } else {
            console.log(`Status Reason is an unhandled value (${status}). Defaulting to hiding tab3.`);
            tab.setVisible(false); // Default to hiding tab3 for other values
        }
    } else {
        console.log("Tab 'tab3' not found. No action taken.");
    }

    console.log("==== END show/hide Status Reason ===="); // End log
}

function registerOnChangeEvent(executionContext) {
    console.log("Registering OnChange event for Status Reason...");

    // Get the form context from the execution context
    const formContext = executionContext.getFormContext();
    console.log("Retrieved form context.");

    // Get the Status Reason attribute (statuscode column)
    const statusAttribute = formContext.getAttribute("statuscode");
    if (statusAttribute) {
        console.log("Status Reason attribute found. Registering OnChange event handler.");
        // Register the toggleByStatusReason function for the OnChange event
        statusAttribute.addOnChange(toggleByStatusReason);
        // Call the function initially to set the correct visibility on form load
        console.log("Calling toggleByStatusReason initially to set visibility on form load.");
        toggleByStatusReason(executionContext);
    } else {
        console.log("Status Reason attribute not found. OnChange event handler not registered.");
    }
}
```

2. Update the script Update the tab name in your version of the JavaScript to match the tab name on your model-driven app form.
3. Add it to your solution In your solution, navigate to Web Resources. Upload a new JavaScript file (e.g. statusReasonToggle.js). Paste in the code. Publish.
4. Hook it up in the Form Editor Open the form you want to target. Go to Form Properties. Add your new JavaScript Web Resource. Under Events, for the Form OnLoad, call registerOnChangeEvent and pass the execution context as the first parameter. Save and publish the form.
5. Test It Out Now when you load a record, or change the Status Reason, the tab will show or hide automatically based on the logic you’ve defined. Nice and clean.
Notes This script uses console.log statements throughout — which makes it really easy to troubleshoot using browser dev tools. 
You’ll need to replace tab3 with the name of the tab on your own form. Adjust the numeric values (1, 304880001) to match your own Status Reason options. Final Thoughts This is one of those tiny bits of polish that makes a big difference for your users. A cleaner interface, less noise, and a better user experience — with just a few lines of JavaScript.\nIf you find this useful, feel free to ⭐ the GitHub repo or fork it for your own snippets!\nIf you’ve got a similar problem or want to do something more complex — drop me a message. Always happy to help.\nHappy building!\n","date":"April 12, 2025","hero":"/posts/250412-show-hide-status-reason/large.png","permalink":"https://techtweedie.github.io/posts/250412-show-hide-status-reason/","summary":"","tags":["JavaScript","Model Driven App","Status Reason","UX"],"title":"Show and Hide Tabs Based on Status Reason in Model Driven Apps"},{"categories":["Power Platform Clinic"],"contents":"Introduction Welcome to the very first episode of Power Platform Clinic with Duncan and Ian.\nIn this episode:\n✅ Topic 1 – Show/Hide Tabs Dynamically in Model Driven Apps Ian walks through how to show or hide a whole tab in a model-driven app based on the value of a field, such as Status Reason. 
The demo covers:\nWhy you might want to hide/show tabs A full explanation of the JavaScript snippet used How to upload and register the script in your form libraries Gotchas to watch out for (like needing to register the event handler onload and onchange) Testing the functionality and reviewing console logs to understand what\u0026rsquo;s happening under the hood 👉 This approach is especially useful when you only want tabs visible at specific record statuses instead of toggling individual fields.\n✅ Topic 2 – Calculated Columns vs Measures in Power BI Duncan takes over to demystify when to use calculated columns vs measures in Power BI, including:\nHow calculated columns are stored in the model and increase dataset size How measures calculate dynamically based on report context Use cases for each, including slicing, relationships, and dynamic calculations Performance considerations for model refresh times and visuals Best practice tips on organising measures in a dedicated Measures table 🔎 Key takeaway:\nUse calculated columns when you need a static value for slicing or relationships, and measures for dynamic aggregations like KPIs and totals.\n👥 Get Involved Have a Power Platform question you want answered on the Clinic? 
Drop it in the comments, reach out on LinkedIn, or send a carrier pigeon (we’re flexible).\n🙏 Thanks for watching.\nIf you found this episode helpful, please like, subscribe, and share to support the channel and help others in the community.\nAbout Power Platform Clinic\nA community-driven video series where Duncan Boyne and Ian Tweedie answer real questions from the Power Platform world, covering everything from Model Driven Apps to Power BI to DevOps and beyond.\n","date":"April 12, 2025","hero":"/podcasts/power-platform-clinic/250412-power-platform-clinic-episode-one/large.png","permalink":"https://techtweedie.github.io/podcasts/power-platform-clinic/250412-power-platform-clinic-episode-one/","summary":"","tags":["Power Platform","Power Automate","Power BI","Model Driven Apps"],"title":"Power Platform Clinic Episode 1"},{"categories":["Power Platform","Power Apps","Power Automate","Community Event","Model Driven Apps","Power Pages","Microsoft 365"],"contents":"D365 \u0026amp; Power Platform UG North East – March 2025 Meetup\n📅 Date: Thursday, 27th March 2025\n🕕 Time: 6:30 PM – 8:30 PM BST\n📍 Location: BJSS, 12th Floor, Bank House, Newcastle upon Tyne NE1 6SQ 🎟️ Event Link: Meetup\nCo-Hosted By This event is proudly co-hosted by:\nAgata Guziur Justin Wilkinson Ian Tweedie Together, we’re bringing the North East community a fantastic evening of learning, sharing, and connection.\nConnect. Learn. Share. 
The March 2025 meetup of the D365 \u0026amp; Power Platform User Group North East brought insightful sessions, practical demos, and honest community conversations.\nWhether you\u0026rsquo;re a solution architect, functional consultant, developer, or Power Platform-curious, this meetup offered something for everyone.\nAgenda 18:30 – Welcome, Networking, and Introductions Attendees arrived, grabbed food and drinks, and settled in for the evening’s sessions.\n19:00 – The limitations of Business Rules in a Model-driven app and how you would overcome them Speaker: Andrew Wolfe\nAndrew shared:\nAnswers he looks for when interviewing Functional Consultants, Technical Consultants, and Solution Architects Using Business Rules alone vs. leveraging JavaScript for more powerful, maintainable automation A practical demo journey showing how to simplify documentation, enhance functionality, and reduce reliance on large sets of Business Rules 19:35 – Break, Food \u0026amp; Networking An opportunity to discuss the first session and catch up with local community members.\n19:50 – Power Pages Meets GDS: Challenges, Solutions, and Lessons Learned Speaker: Steve Middleton\nSteve explored:\nAdapting Power Pages sites to meet Government Digital Service (GDS) standards Styling challenges with subgrids and forms, and the workarounds used to achieve compliance A live demonstration of a multi-step “Check Your Answers” form Lessons learned when aligning Power Pages implementations with GDS design patterns Why Attend? 
💡 Learn from real-world experts 👥 Connect with the local D365 and Power Platform community 🧠 Gain practical tips and strategic insights 🍕 Enjoy free food, drinks, and great conversation 👉 Join the Next Meetup Keep an eye on upcoming events to continue learning and networking with the community.\nFor reflections, write-ups, and event announcements, check back here or follow along on LinkedIn.\n","date":"March 27, 2025","hero":"/events/250327-d365ppugne/large.png","permalink":"https://techtweedie.github.io/events/250327-d365ppugne/","summary":"","tags":["Power Platform","Power Apps","Power Automate","Community Event","Model Driven Apps","Power Pages","Microsoft 365"],"title":"D365 \u0026 Power Platform UG North East – March 2025 Meetup"},{"categories":["Power Platform","Power Automate","Azure Functions","Community Event"],"contents":"📅 Date: 25th March 2025\n📍 Location: SWG3 Glasgow (100 Eastvale Pl, Stobcross Rd, Glasgow City)\n🕕 Time: 6:00 PM to 9:00 PM GMT\nSession Overview At the Scottish Power Platform User Group, I delivered a session titled:\n✨ “Enabling low-code developers to create custom client reports with a helping hand from Azure”\nIn this session, I explored:\nUsing Power Automate to generate custom reports securely. How to leverage an Azure Function as a reverse proxy to enhance security and scalability. An architectural pattern that integrates Dataverse, Power Automate, SharePoint, and Azure Functions to deliver dynamic, client-ready PDFs via secure links. Presentation Slides You can view my full presentation here:\n➡️ View the Presentation Slides\nGitHub Repository The demo and solution shared in this session are available on GitHub:\n➡️ AzureFunction-PowerAutomateProxy Repository\nThis repository includes:\nAn Azure Function HTTP Proxy that authenticates requests and forwards them to your Power Automate Flow. Full support for GET, POST, and OPTIONS methods. Configurable environment variables to avoid hardcoded URLs or keys. 
Deploy-to-Azure templates for quick deployment into your own environment. Key Features of the Solution ✅ Forwards all headers from the incoming request.\n✅ Adds a custom Flow-Key header (from environment variables).\n✅ Appends query parameters to the external URL dynamically.\n✅ Provides a robust pattern for securely exposing Power Automate flows as web endpoints.\nTry It Out If you’re interested in setting up this pattern for your own client reporting requirements, follow these steps:\nAccess the GitHub repo. Deploy using the Deploy to Azure button provided in the README. Import the Power Platform solutions to your environment. Configure your environment variables with your Logic App or Flow URL and secret key. Test your new secure reverse proxy-enabled flow! Why This Matters Many organisations need to provide custom PDF or HTML reports to clients securely without exposing their backend systems. Combining Power Automate for report generation with Azure Functions for authentication and reverse proxying provides a scalable, secure, and low-code approach to achieve this.\nEvent Registration Link Here is the event registration link:\n➡️ Event Registration – Scottish Power Platform User Group\nThis event was proudly presented by the Scottish Power Platform User Group featuring sessions from community superstars Ian Tweedie and Duncan Boyne.\n","date":"March 25, 2025","hero":"/events/250325-sppug/image-2.png","permalink":"https://techtweedie.github.io/events/250325-sppug/","summary":"","tags":["Power Platform","Power Automate","Azure Functions","Community Event"],"title":"At the Scottish Power Platform User Group: Enabling Low-Code Developers to Create Custom Client Reports with a Helping Hand from Azure – Scottish Power Platform User Group"},{"categories":["How to"],"contents":"Introduction Want to grant and remove item or folder permissions in SharePoint using Power Automate with a Service Principal but can\u0026rsquo;t do it? 
Not able to do it with the out of the box SharePoint Connector?\nTired of dealing with Conditional Access Policies, password expirations, and frequent logins?\nGood news! In this guide, I\u0026rsquo;ll show you how to set these permissions using an App Registration. No service accounts, no password headaches. I\u0026rsquo;ll also share a Custom Connector to make it even easier, plus step-by-step instructions to:\nCreate and configure the App Registration Assign API permissions Set permissions using sites.selected Use Power Automate with the custom connector What\u0026rsquo;s the problem? Let\u0026rsquo;s have a look at the problem we are trying to solve. We are going to go into Power Automate Flow and see if we can edit, add, or remove a SharePoint Permission using the Standard SharePoint Connector.\nAs we can see it is not possible.\nTo solve this problem I am going to share with you a connector I have written and released that is free to use, and show you how to set it up and use it.\nCreate the App Registration For this section you will need the help of a Global Administrator\nStep 1 - Log in to Entra By default most users will have the ability to go to https://entra.microsoft.com and log in with your user account.\nStep 2 - Create the app registration Next we are going to create an application registration so our Power Automate flow can talk directly to SharePoint Online.\nWhen you are on the Entra Overview page, open up Identity. Then navigate to Applications in the left hand menu and then open up App Registrations. From there select New Registration. A new screen will open, give your new app registration a name and then click next. Step 3 - Add API permission We then need to give our App Registration an API Permission. This will be the permission used to talk to SharePoint Online.\nIn the left hand menu click on API Permission. Click on Add permission. A window will then open, select Application Permission. Then using the search box type in sites.selected. 
The list will then filter, as it does open up the Sites option and select sites.selected. Step 4 - Grant Admin consent If the Grant admin consent is grayed out then this is because we lack administrative roles for our account. For this next step you will need some help from a Global Admin.\nAsk a Global administrator to grant admin consent for you if you can\u0026rsquo;t do it.\nSetting your application permissions in SharePoint Online Our next step is to set what permissions the application has within SharePoint. This is different to Exchange, where you limit a default permission; with sites.selected we have to explicitly grant permission rather than limit it. This stage will need to be done by someone who is an Owner of the SharePoint site, using Microsoft Graph.\nFurther reading: Further information on the API call can be found here.\nStep 1 - Get SharePoint Site ID Navigate to your SharePoint Site, e.g. https://tweedtech.sharepoint.com/sites/TechTweedieDemoSite1 and then just add /_api/site?$select=Id on the end.\nFor example this would give you the URL https://tweedtech.sharepoint.com/sites/TechTweedieDemoSite1/_api/site?$select=Id.\nYou will get XML back that looks like this\n\u0026lt;?xml version=\u0026#34;1.0\u0026#34; encoding=\u0026#34;utf-8\u0026#34;?\u0026gt; \u0026lt;entry xml:base=\u0026#34;https://tweedtech.sharepoint.com/sites/TechTweedieDemoSite1/_api/\u0026#34; xmlns=\u0026#34;http://www.w3.org/2005/Atom\u0026#34; xmlns:d=\u0026#34;http://schemas.microsoft.com/ado/2007/08/dataservices\u0026#34; xmlns:m=\u0026#34;http://schemas.microsoft.com/ado/2007/08/dataservices/metadata\u0026#34; xmlns:georss=\u0026#34;http://www.georss.org/georss\u0026#34; xmlns:gml=\u0026#34;http://www.opengis.net/gml\u0026#34;\u0026gt; \u0026lt;id\u0026gt;https://tweedtech.sharepoint.com/sites/TechTweedieDemoSite1/_api/site\u0026lt;/id\u0026gt; \u0026lt;category term=\u0026#34;SP.Site\u0026#34; 
scheme=\u0026#34;http://schemas.microsoft.com/ado/2007/08/dataservices/scheme\u0026#34; /\u0026gt; \u0026lt;link rel=\u0026#34;edit\u0026#34; href=\u0026#34;site\u0026#34; /\u0026gt; \u0026lt;title /\u0026gt; \u0026lt;updated\u0026gt;2025-02-13T21:53:24Z\u0026lt;/updated\u0026gt; \u0026lt;author\u0026gt; \u0026lt;name /\u0026gt; \u0026lt;/author\u0026gt; \u0026lt;content type=\u0026#34;application/xml\u0026#34;\u0026gt; \u0026lt;m:properties\u0026gt; \u0026lt;d:Id m:type=\u0026#34;Edm.Guid\u0026#34;\u0026gt;9ce4e8e2-fa87-474b-bd2f-d858d828f8a1\u0026lt;/d:Id\u0026gt; \u0026lt;/m:properties\u0026gt; \u0026lt;/content\u0026gt; \u0026lt;/entry\u0026gt; Locate the Id value; within the XML it will look something like this\n\u0026lt;d:Id m:type=\u0026#34;Edm.Guid\u0026#34;\u0026gt;9ce4e8e2-fa87-474b-bd2f-d858d828f8a1\u0026lt;/d:Id\u0026gt; We need the Site ID value, which in this example is 9ce4e8e2-fa87-474b-bd2f-d858d828f8a1\nStep 2 - Give Permission to our App Registration for this site The first stage is to grant our app registration permissions on the SharePoint site; to do this we need to use Graph Explorer. You will also need to be an Owner of the SharePoint Site you are giving permissions to.\nStep 1 - Access MS Graph Explorer Go to https://developer.microsoft.com/en-us/graph/graph-explorer Sign in to your account. Make sure the Tenant shows your tenant or company name. Then press Run query to make sure you are connected. Step 2 - Set permissions The next stage is to run the Create permission command for the app registration to give it the necessary permissions to be able to set permissions for files and folders.\nTo do this we need to:\nChange the method to POST.\nEnter in the following address https://graph.microsoft.com/v1.0/sites/{siteId}/permissions, replacing {siteId} with your Site ID. e.g. 
https://graph.microsoft.com/v1.0/sites/9ce4e8e2-fa87-474b-bd2f-d858d828f8a1/permissions.\nClick on Modify permissions and open the permissions panel.\nSearch for site, and then open up Sites.\nClick on Consent for Sites.FullControl.All. This will open up a consent screen.\nThen in Request body enter the below JSON, replacing {clientID} and {displayName} with your application\u0026rsquo;s details.\n{ \u0026#34;roles\u0026#34;: [\u0026#34;write\u0026#34;], \u0026#34;grantedToIdentities\u0026#34;: [ { \u0026#34;application\u0026#34;: { \u0026#34;id\u0026#34;: \u0026#34;{clientID}\u0026#34;, \u0026#34;displayName\u0026#34;: \u0026#34;{displayName}\u0026#34; } } ] } e.g.\n{ \u0026#34;roles\u0026#34;: [\u0026#34;write\u0026#34;], \u0026#34;grantedToIdentities\u0026#34;: [ { \u0026#34;application\u0026#34;: { \u0026#34;id\u0026#34;: \u0026#34;229df885-246d-4b6f-8280-267e51f9dc65\u0026#34;, \u0026#34;displayName\u0026#34;: \u0026#34;SharePoint - Demo1 - PowerAutomate\u0026#34; } } ] } You will then get a response like the one below\n{ \u0026#34;@odata.context\u0026#34;: \u0026#34;https://graph.microsoft.com/v1.0/$metadata#sites(\u0026#39;9ce4e8e2-fa87-474b-bd2f-d858d828f8a1\u0026#39;)/permissions/$entity\u0026#34;, \u0026#34;id\u0026#34;: \u0026#34;aTowaS50fG1zLnNwLmV4dHwyMjlkZjg4NS0yNDZkLTRiNmYtODI4MC0yNjdlNTFmOWRjNjVANjM3NTlkOWYtYmZjYS00ZjUyLWFlOTgtOGYyZjFkN2JjMTcz\u0026#34;, \u0026#34;grantedToIdentitiesV2\u0026#34;: [ { \u0026#34;application\u0026#34;: { \u0026#34;displayName\u0026#34;: \u0026#34;SharePoint - Demo1 - PowerAutomate\u0026#34;, \u0026#34;id\u0026#34;: \u0026#34;229df885-246d-4b6f-8280-267e51f9dc65\u0026#34; } } ], \u0026#34;grantedToIdentities\u0026#34;: [ { \u0026#34;application\u0026#34;: { \u0026#34;displayName\u0026#34;: \u0026#34;SharePoint - Demo1 - PowerAutomate\u0026#34;, \u0026#34;id\u0026#34;: \u0026#34;229df885-246d-4b6f-8280-267e51f9dc65\u0026#34; } } ] } Download and testing the connector Step 1 - Find custom connectors First we need to find custom 
connectors in Power Automate.\nTo do this we need to:\nNavigate to https://make.powerautomate.com/ Change our environment should you need to. Then in the left hand menu, navigate to More and then Discover all, and then locate Custom connectors. Step 2 - Create a new connector Click on New custom connector. Then click on Import an OpenAPI from URL. Then enter in the connector name SharePoint with Graph. Then enter in the URL https://raw.githubusercontent.com/itweedie/PowerPlatform-PowerAutomate-SharePoint-with-Graph-Connector/refs/heads/main/connector/shared_sharepoint-20with-20graph-5fbb1338f75d4745cb-5f8d99aea54e2a1a34/apiDefinition.swagger.json Step 3 - Configure your connector Click on to the Security tab. Make sure we are using OAuth 2.0. Make sure the Identity Provider is set to Azure Active Directory and that Enable Service Principal support is ticked. Click in to Client ID. Navigate back to Entra and locate your App Registration. Copy the Client ID and paste it in to the Client ID box on the Custom Connector. Go back to the App Registration in Entra, and click on Certificates \u0026amp; secrets, then click on New client secret. Choose a name and a reasonable expiry date that fits within your organisation\u0026rsquo;s policies. Copy the Secret value, NOT Secret ID, and paste it in to your connector. You will need the secret value one more time, so keep the Entra page open with it on. Enter in Resource URL as https://graph.microsoft.com. Click Create. Step 5 - Add your first connection Click on to Test. Then click on to New connection. You should then get a screen which lets you choose Service Principal; if you don\u0026rsquo;t, repeat step 3. Then click Create Connection. Enter in your Secret (we do this first as we already have the page open from Step 3). Then enter in your Client ID and Tenant ID. Then click Create Connection. 
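Behind the scenes, a Service Principal connection is just the standard OAuth 2.0 client-credentials flow against Entra ID. This is a minimal sketch (an assumption about what the platform does for you, not connector source code) showing how the Tenant ID, Client ID, and secret you just entered are combined into a token request:

```python
import urllib.parse

def token_request(tenant_id: str, client_id: str, client_secret: str):
    """Build the Entra ID v2.0 client-credentials token request.

    These are the same three values entered when creating the connection.
    The .default scope requests whatever application permissions were
    granted to the app registration, e.g. sites.selected in this guide.
    """
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://graph.microsoft.com/.default",
    })
    return url, body

url, body = token_request("your-tenant-id", "your-client-id", "your-secret-value")
print(url)
```

Because the token is issued to the app registration itself, there is no user password to expire and no Conditional Access prompt, which is the whole point of this setup.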
Step 6 - Test get site information Let\u0026rsquo;s see if we can connect to our SharePoint Site using our app registration.\nOn the Test tab click on SiteInfomation. Enter in the details from your site URL. For our demo we are using https://tweedtech.sharepoint.com/sites/TechTweedieDemoSite1/. We will enter in the tenantPrefix, for us this is tweedtech. For site name we will enter in TechTweedieDemoSite1 from the url of our site. For site path we can see from the URL this is sites. Click on Test operation to see if it works. Testing adding a permission You can skip this part if you would like, and go to the next section.\nStep 1 - Test listing the drives We are going to test the ability to list site drives. The site drives are the document libraries contained within a SharePoint Site, and we will need the drive ID to set permissions later.\nEnter in your site id, for us this is 9ce4e8e2-fa87-474b-bd2f-d858d828f8a1 and we will get our Drive ID back which for us is b!4ujknIf6S0e9L9hY2Cj4oSy8fRwGRJ9Ghv-lEfn4u6ovHPyydePwRosMG0M4nhQc.\nWe can do this in Power Automate Flow later or save it as an environment variable, or hard code it depending on how we want to use it.\nStep 2 - Let\u0026rsquo;s test adding a folder So we have something to give permission for.\nTo do this let\u0026rsquo;s:\nClick on the AddFolder action. Enter in siteID, for us this is 9ce4e8e2-fa87-474b-bd2f-d858d828f8a1. Let\u0026rsquo;s enter in our driveID which is b!4ujknIf6S0e9L9hY2Cj4oSy8fRwGRJ9Ghv-lEfn4u6ovHPyydePwRosMG0M4nhQc. Then let\u0026rsquo;s give our folder a name, we have to enter this twice. We have got the response back and as part of the response we have an item ID 01LDPBINVTIMOY4Y3TGRDIDMVHU4F3HIZM. We will need the item ID to set the permissions against. Step 3 - Let\u0026rsquo;s add the permission We are now able to test adding the permission.\nClick on to AddItemPermission. Enter in siteID, for us this is 9ce4e8e2-fa87-474b-bd2f-d858d828f8a1. 
Let\u0026rsquo;s enter in our driveID which is b!4ujknIf6S0e9L9hY2Cj4oSy8fRwGRJ9Ghv-lEfn4u6ovHPyydePwRosMG0M4nhQc. Then our itemID which we have as 01LDPBINVTIMOY4Y3TGRDIDMVHU4F3HIZM. Then we need to add in a message. We are going to put in Test Message. We are going to set RequireSignIn to true. We are going to set sendInvitation to true. The email of the user we are going to invite is demo@tweed.technology. And the permission we are going to give them is read. We can also see we got this through as an email message too.\nStep 4 - Let\u0026rsquo;s try it in a Flow Click on My flows Create a new flow. Add a trigger. Add a new step. From Connector type choose Custom. Then select SharePoint with Graph from the list. Enter in siteID, for us this is 9ce4e8e2-fa87-474b-bd2f-d858d828f8a1. Let\u0026rsquo;s enter in our driveID which is b!4ujknIf6S0e9L9hY2Cj4oSy8fRwGRJ9Ghv-lEfn4u6ovHPyydePwRosMG0M4nhQc. Then our itemID which we have as 01LDPBINVTIMOY4Y3TGRDIDMVHU4F3HIZM. Then we need to add in a message. We are going to put in Test Message. We are going to set RequireSignIn to true. We are going to set sendInvitation to true. The email of the user we are going to invite is demo@tweed.technology. And the permission we are going to give them is write. Test and make sure the email comes through. We can see we have got an email through so our user is aware they have been given access to this folder. Let\u0026rsquo;s Demo in a Power Automate Flow Let\u0026rsquo;s demo it in a Power Automate flow, so we can set the permissions for a folder we will create.\nStep 1 - Start with a flow to list the drives First we need to List the Drives in our site so we can find the right one.\nAdd a new step. From Connector type choose Custom. Then select SharePoint with Graph from the list. Enter in siteID, for us this is 9ce4e8e2-fa87-474b-bd2f-d858d828f8a1. Step 2 - Filter the drives we got back Our next step is to filter the array of drives we got back.\nAdd a new step. Search for filter. 
Click on the data operation filter action. Then add the following details: In from select value from the Site Drives step. In the Choose value box select name from the Site Drives step. Enter the name of your document library; in our example this is a library called test. Let\u0026rsquo;s test it. What we get back in outputs are the details for the document library which we will use in our next step. Step 3 - Let\u0026rsquo;s test adding a folder So we have something to give permission for.\nTo do this let\u0026rsquo;s:\nAdd a new step. From Connector type choose Custom. Then select SharePoint with Graph from the list. Click on the Add Folder action. Enter in siteID, for us this is 9ce4e8e2-fa87-474b-bd2f-d858d828f8a1. In the Drive ID add in the id from the filter action. Then let\u0026rsquo;s give our folder a name, we have to enter this twice. Step 4 - Let\u0026rsquo;s add the permission We are now able to test adding the permission.\nAdd a new step. From Connector type choose Custom. Then select SharePoint with Graph from the list. Click on to Add Item Permission. Enter in siteID, for us this is 9ce4e8e2-fa87-474b-bd2f-d858d828f8a1. For our drive ID, add in the id from the filter action. For our item ID, add in the id from the Add Folder action. Then we need to add in a message. We are going to put in Test Message. We are going to set Require Sign In to Yes. We are going to set Send Invitation to Yes. The email of the user we are going to invite is demo@tweed.technology. And the permission we are going to give them is write. Conclusion Congratulations! You have successfully set folder permissions in SharePoint using Power Automate Flow. By following these steps, you can streamline your workflow and ensure that the right people have the appropriate access to your SharePoint folders. This process not only saves time but also enhances the security and management of your documents. 
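As a recap of the walkthrough above: the Add Item Permission action corresponds to the Microsoft Graph invite endpoint for drive items. This sketch shows the request that pattern implies, using the siteID, driveID, and itemID values from the demo; the exact mapping inside the custom connector is an assumption, so treat this as illustrative only:

```python
import json

def invite_request(site_id: str, drive_id: str, item_id: str,
                   email: str, role: str, message: str):
    """Build the Graph 'invite' call that grants a user access to a drive item.

    Mirrors the fields filled in on the Add Item Permission action:
    requireSignIn, sendInvitation, the message, the recipient email,
    and the role ("read" or "write").
    """
    url = (f"https://graph.microsoft.com/v1.0/sites/{site_id}"
           f"/drives/{drive_id}/items/{item_id}/invite")
    body = {
        "recipients": [{"email": email}],
        "message": message,
        "requireSignIn": True,
        "sendInvitation": True,  # this is what produces the notification email
        "roles": [role],
    }
    return url, json.dumps(body)

url, body = invite_request(
    "9ce4e8e2-fa87-474b-bd2f-d858d828f8a1",
    "b!4ujknIf6S0e9L9hY2Cj4oSy8fRwGRJ9Ghv-lEfn4u6ovHPyydePwRosMG0M4nhQc",
    "01LDPBINVTIMOY4Y3TGRDIDMVHU4F3HIZM",
    "demo@tweed.technology", "write", "Test Message")
print(url)
```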
Keep exploring the capabilities of Power Automate to further optimize your SharePoint experience. Happy automating!\n","date":"February 17, 2025","hero":"/posts/250217-set-folder-permissions-in-sharepoint-with-power-automate-flow/featureImage.png","permalink":"https://techtweedie.github.io/posts/250217-set-folder-permissions-in-sharepoint-with-power-automate-flow/","summary":"","tags":["Power Automate","EntraID","Power Platform"],"title":"Set folder permissions in SharePoint with Power Automate Flow"},{"categories":["Azure","Power Platform","Power Automate"],"contents":"Introduction Want to send emails from Power Automate but can\u0026rsquo;t or don\u0026rsquo;t want to use a service account? Tired of dealing with Conditional Access Policies, password expirations, and frequent logins? Do you want a simple, secure, and scalable way to send emails without constantly re-confirming security information?\nGood news! In this guide, I\u0026rsquo;ll show you how to send emails directly from Microsoft Exchange using only an App Registration. No service accounts, no password headaches. I\u0026rsquo;ll also share a Custom Connector to make it even easier, plus step-by-step instructions to:\nCreate and configure the App Registration Assign API permissions Restrict sending access to specific mailboxes Use Power Automate with the custom connector Send emails using app registration Create the App Registration For this section you will need the help of a Global Administrator\nStep 1 - Log in to Entra By default most users will have the ability to go to https://entra.microsoft.com and log in with your user account.\nStep 2 - Create the app registration Next we are going to create an application registration so our Power Automate flow can talk directly to Exchange Online.\nWhen you are on the Entra Overview page, open up Identity. Then navigate to Applications in the left hand menu and then open up App Registrations. From there select New Registration. 
A new screen will open, give your new app registration a name and then click next. Step 3 - Add API permission We then need to give our App Registration an API Permission. This will be the permission used to talk to Exchange Online.\nIn the left hand menu click on API Permission. Click on Add permission. A window will then open, select Application Permission. Then using the search box type in mail.send. The list will then filter, as it does open up the Mail option and select mail.send. NOTE: Yes, this permission lets you send emails as anyone; however, we will restrict this later. DO NOT MISUSE IT. IT IS POSSIBLE TO TRACK WHERE AN EMAIL CAME FROM.\nStep 4 - Grant Admin consent You will notice that the Grant admin consent is grayed out. This is because we currently lack administrative roles for our account. For this next step you will need a Global Admin.\nAsk a Global administrator to grant admin consent for you. Limiting application permissions in Exchange Our next step is to limit what permissions our application will have within Exchange. We have already said it will have mail.send; however, without further limitation that will allow the application to send emails from any user\u0026rsquo;s email account.\nThis stage will need to be done by an Exchange Online Administrator.\nFurther reading: Further information on these next steps can be found here.\nStep 1 - Set up a Mail Enabled Security Group When we tell Exchange that we want to restrict access in some way for an application registration, we need to first create a Mail-enabled Security group.\nThere are a few ways to do this, however today we are going to do it from the Microsoft 365 Admin center.\nGo to Microsoft 365 Admin center Then open up Teams and Groups. Select \u0026lsquo;Security\u0026rsquo; from the menu. Click on Add new security group. Step 2 - Access PowerShell For this next step we are going to use some PowerShell. 
You can do this from a location of your choice; however, today we are going to demo it from Cloud Shell.\nNavigate to Exchange Online https://admin.cloud.microsoft/exchange#/homepage. Click on the Cloud Shell button in the top right hand corner. Step 3 - Create Application Access Policy Next we are going to create an application access policy using both the Application ID and the Mail-enabled security group we created in earlier steps.\nLet\u0026rsquo;s look at our command. New-ApplicationAccessPolicy -AppId b9701c1e-1364-464d-93e4-01ae925e8d6c -PolicyScopeGroupId PowerAutomateTest@Tweed.technology -AccessRight RestrictAccess -Description \u0026quot;Restrict this app to members of PowerAutomateTest@Tweed.technology\u0026quot; Breaking this down we have Command: New-ApplicationAccessPolicy - This cmdlet creates a new application access policy in Microsoft 365. Parameter: -AppId b9701c1e-1364-464d-93e4-01ae925e8d6c - Specifies the unique identifier (AppId) of the application for which the policy is being created. Parameter: -PolicyScopeGroupId PowerAutomateTest@Tweed.technology - Defines the scope of the policy by specifying the group ID (email address) that the policy will apply to. Parameter: -AccessRight RestrictAccess - Sets the access right for the policy. In this case, it restricts access. Parameter: -Description \u0026quot;Restrict this app to members of PowerAutomateTest@Tweed.technology\u0026quot; - Provides a description for the policy, explaining its purpose. Let\u0026rsquo;s try running the command in PowerShell using the CloudShell. Oh no, it doesn\u0026rsquo;t work. You could be forgiven for thinking that, given we opened this from the Exchange Admin center, we would already be connected to Exchange Online within the CloudShell, but unfortunately we are not. Therefore, before we go any further, we need to install the Exchange Online Management Module. Install-Module -Name ExchangeOnlineManagement -Force. 
Step 4 - Import and Connect Our next step is to import the module and connect to Exchange Online.\nTo do this we need to:\nFirst we need to import the module we have just installed; to do this we run Import-Module ExchangeOnlineManagement Next we need to connect to Exchange; within the CloudShell the easiest way to do this is by using device login. Run this command Connect-ExchangeOnline -Device. This will give us a URL and a device code in order to log in to Exchange Online. Next let\u0026rsquo;s try re-running our command to create the new policy New-ApplicationAccessPolicy -AppId b9701c1e-1364-464d-93e4-01ae925e8d6c -PolicyScopeGroupId PowerAutomateTest@Tweed.technology -AccessRight RestrictAccess -Description \u0026quot;Restrict this app to members of PowerAutomateTest@Tweed.technology\u0026quot; This time we get the response ScopeName : Power Automate Test ScopeIdentity : Power Automate Test20250209121934 Identity : 63759d9f-bfca-4f52-ae98-8f2f1d7bc173\\b9701c1e-1364-464d-93e4-01ae925e8d6c:S-1-5-21-3787302941-3231517822-469913106-31437838;998e9d79-817d-41c9-87d8-d9c07f27f4b2 AppId : b9701c1e-1364-464d-93e4-01ae925e8d6c ScopeIdentityRaw : S-1-5-21-3787302941-3231517822-469913106-31437838;998e9d79-817d-41c9-87d8-d9c07f27f4b2 Description : Restrict this app to members of PowerAutomateTest@Tweed.technology AccessRight : RestrictAccess ShardType : All IsValid : True ObjectState : Unchanged Step 5 - Let\u0026rsquo;s test it in PowerShell We can now test using PowerShell to see if it\u0026rsquo;s applied correctly.\nTo do this:\nWe are going to run the following command Test-ApplicationAccessPolicy -Identity testABC@Tweed.technology -AppId b9701c1e-1364-464d-93e4-01ae925e8d6c\nIf we break down this command:\nCommand: Test-ApplicationAccessPolicy - This cmdlet tests an application access policy in Microsoft 365 to verify if a user has access. 
Parameter: -Identity testABC@Tweed.technology - Specifies the identity (email address) of the user to test against the application access policy. Parameter: -AppId b9701c1e-1364-464d-93e4-01ae925e8d6c - Specifies the unique identifier (AppId) of the application for which the policy is being tested. Running the command we get the following response:\nAppId : b9701c1e-1364-464d-93e4-01ae925e8d6c Mailbox : testABC MailboxId : 75283b3b-609a-4c1c-b8b8-baa1342fdfa6 MailboxSid : S-1-5-21-3787302941-3231517822-469913106-31499791 AccessCheckResult : Granted Let\u0026rsquo;s test this against a different email address that is not within the Mail-enabled security group by running Test-ApplicationAccessPolicy -Identity demo@Tweed.technology -AppId b9701c1e-1364-464d-93e4-01ae925e8d6c.\nAppId : b9701c1e-1364-464d-93e4-01ae925e8d6c Mailbox : demo MailboxId : d2ca4050-f8a9-4986-b998-387603b466b6 MailboxSid : S-1-5-21-3787302941-3231517822-469913106-19344836 AccessCheckResult : Denied We can see it has been Denied, which is the response we expected.\nDownload and testing the connector Step 1 - Find custom connectors First we need to find custom connectors in Power Automate.\nTo do this we need to:\nNavigate to https://make.powerautomate.com/ Change our environment should you need to. Then in the left hand menu, navigate to More and then Discover all, and then locate Custom connectors. Step 2 - Create a new connector Click on New custom connector. Then click on Import an OpenAPI from URL. Then enter in the connector name Send email using Graph. Then enter in the URL https://raw.githubusercontent.com/itweedie/PowerPlatform-Send-Emails-from-Power-Automate-without-a-Service-Account/refs/heads/main/connector/shared_mightora-5fsend-20mail-20with-20graph-5fe07b0f04a8b0d4c3/apiDefinition.swagger.json Step 3 - Configure your connector Click on to the Security tab. 
Make sure we are using OAuth 2.0. Make sure the Identity Provider is set to Azure Active Directory and that Enable Service Principal support is ticked. Click in to Client ID. Navigate back to Entra and locate your App Registration. Copy the Client ID and paste it in to the Client ID box on the Custom Connector. Go back to the App Registration in Entra, and click on Certificates \u0026amp; secrets, then click on New client secret. Choose a name and a reasonable expiry date that fits within your organisation\u0026rsquo;s policies. Copy the Secret value, NOT Secret ID, and paste it in to your connector. You will need the secret value one more time, so keep the Entra page open with it on. Enter in Resource URL as https://graph.microsoft.com. Click Create. Step 4 - Add C# to process the attachment In order to be able to send attachments we need to add some C# to our connector. If you are not going to be sending attachments you can skip this step.\nTo do this you need to:\nClick on to Code. Click to enable code. Copy and paste the below code in to the code box. 
public class Script : ScriptBase { public override async Task\u0026lt;HttpResponseMessage\u0026gt; ExecuteAsync() { // Read the request content as a string var requestContentAsString = await this.Context.Request.Content.ReadAsStringAsync().ConfigureAwait(false); // Parse the request content string into a JSON object var requestContentAsJson = JObject.Parse(requestContentAsString); // Modify the attachments array if it exists List\u0026lt;string\u0026gt; attachmentFileTypes = new List\u0026lt;string\u0026gt;(); if (requestContentAsJson[\u0026#34;message\u0026#34;]?[\u0026#34;attachments\u0026#34;] is JArray attachments) { foreach (var attachment in attachments) { // Add the @odata.type element attachment[\u0026#34;@odata.type\u0026#34;] = \u0026#34;#microsoft.graph.fileAttachment\u0026#34;; } } // Set the modified JSON back to the request content this.Context.Request.Content = CreateJsonContent(requestContentAsJson.ToString()); // Send the API request and get the response var response = await this.Context.SendAsync(this.Context.Request, this.CancellationToken).ConfigureAwait(continueOnCapturedContext: false); // Read the response content as a string var responseContentAsString = await response.Content.ReadAsStringAsync().ConfigureAwait(false); // Check if the response content is empty or null if (string.IsNullOrEmpty(responseContentAsString)) { // Set a default message if there is no response from the endpoint responseContentAsString = \u0026#34;{\\\u0026#34;message\\\u0026#34;: \\\u0026#34;No response from the endpoint\\\u0026#34;}\u0026#34;; } else { try { // Try to parse the response content string into a JSON object var responseContentAsJson = JObject.Parse(responseContentAsString); // Convert the JSON object back to a string responseContentAsString = responseContentAsJson.ToString(); } catch (JsonReaderException) { // If parsing fails, set an error message with the invalid JSON response responseContentAsString = $\u0026#34;{{\\\u0026#34;message\\\u0026#34;: 
\\\u0026#34;Invalid JSON response\\\u0026#34;, \\\u0026#34;response\\\u0026#34;: \\\u0026#34;{responseContentAsString}\\\u0026#34;}}\u0026#34;; } } // Make a custom HTTP GET call to the developer messaging API string developerMessage = \u0026#34;Failed to get updated developer message\u0026#34;; try { var request = (HttpWebRequest)WebRequest.Create(\u0026#34;https://developer-message.mightora.io/api/HttpTrigger?appname=send-email-with-graph\u0026#34;); request.Method = \u0026#34;GET\u0026#34;; using (var developerResponse = (HttpWebResponse)request.GetResponse()) { using (var streamReader = new StreamReader(developerResponse.GetResponseStream())) { var developerResponseContent = streamReader.ReadToEnd(); var developerResponseJson = JObject.Parse(developerResponseContent); developerMessage = developerResponseJson[\u0026#34;message\u0026#34;]?.ToString() ?? developerMessage; } } } catch { // If the GET request fails, developerMessage remains as the default failure message } // Create a JSON object to include the original request, the response content, and the developer message var finalResponseContent = new JObject { [\u0026#34;version\u0026#34;] = \u0026#34;1.2.0\u0026#34;, // Add version number here [\u0026#34;responseContent\u0026#34;] = JObject.Parse(responseContentAsString), [\u0026#34;developerMessage\u0026#34;] = developerMessage }; // Set the response content back to the JSON string response.Content = CreateJsonContent(finalResponseContent.ToString()); // Return the response return response; } private bool IsBinary(string content) { // Check if the content contains non-printable characters foreach (char c in content) { if (char.IsControl(c) \u0026amp;\u0026amp; c != \u0026#39;\\r\u0026#39; \u0026amp;\u0026amp; c != \u0026#39;\\n\u0026#39; \u0026amp;\u0026amp; c != \u0026#39;\\t\u0026#39;) { return true; } } return false; } } Step 5 - Add your first connection Click on to Test. Then click on to New connection. 
You should then get a screen which lets you choose Service Principal; if you don\u0026rsquo;t, repeat step 3. Then click Create Connection. Enter in your Secret (we do this first as we already have the page open from Step 3). Then enter in your Client ID and Tenant ID. Then click Create Connection. Step 6 - Test On the Test screen scroll down to Operations. Enter in the following: user-email: An email address you placed in the mail-enabled security group earlier message.subject: test message message.body.contentType: HTML message.body.content: test saveToSentItems: true emailAddress.address (note first one only): An email address you would like to send to Then press Test operation Scroll down to Response and you should get a 202. Test and make sure the email comes through. Step 7 - Let\u0026rsquo;s try it in a Flow Click on My flows Create a new flow. Add a trigger. Add a new step. From Connector type choose Custom. Then select Send email using Graph from the list. Fill out the details for the connector that you want to use. Test and make sure the email comes through. 
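The fields entered in the Test step map onto the Microsoft Graph sendMail request body. This is a hedged sketch of that JSON, including an optional base64-encoded file attachment; field names follow the Graph sendMail API, and the attachment shape mirrors what the C# script in the connector produces (the script adds the @odata.type line for you):

```python
import base64

def send_mail_body(subject, html, to_address, file_name=None, file_bytes=None):
    """Assemble a Graph sendMail request body.

    Matches the Test-screen fields: message.subject, message.body.contentType,
    message.body.content, emailAddress.address, and saveToSentItems.
    """
    message = {
        "subject": subject,
        "body": {"contentType": "HTML", "content": html},
        "toRecipients": [{"emailAddress": {"address": to_address}}],
    }
    if file_bytes is not None:
        message["attachments"] = [{
            # The connector's custom C# code sets this on each attachment.
            "@odata.type": "#microsoft.graph.fileAttachment",
            "name": file_name,
            # Binary content must be base64-encoded before sending.
            "contentBytes": base64.b64encode(file_bytes).decode("ascii"),
        }]
    return {"message": message, "saveToSentItems": True}

body = send_mail_body("test message", "test", "demo@tweed.technology")
```

Graph posts this body to the sendMail endpoint for the chosen user-email, which is why a successful test returns an empty 202 response rather than a JSON payload.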
Sending attachments If you are planning to use this connector to send attachments and the file is binary, you will need to convert it to base64 first.\nConclusion: Secure, Scalable Email Sending from Power Automate By following this guide, you’ve successfully set up a secure, scalable way to send emails from Power Automate—without relying on service accounts, password resets, or Conditional Access headaches.\nWith your App Registration configured, permissions locked down, and custom connector deployed, you now have a robust method to send emails directly through Exchange Online—while ensuring access is tightly controlled.\nKey Takeaways: ✅ No Service Account Required – Uses an App Registration instead.\n✅ Secure and Controlled – Email sending is restricted to specific mailboxes.\n✅ Fully Automated – No need to log in or manage passwords.\n✅ Scalable and Future-Proof – Works seamlessly across flows, reducing admin overhead.\nNow that your setup is complete, you can start integrating email automation into your Power Automate workflows with confidence. Give it a try, experiment with different use cases, and let me know how it works for you!\n🚀 Happy automating! 🚀\n","date":"February 10, 2025","hero":"/posts/250210-send-emails-from-flow-without-a-service-account/featureImage.png","permalink":"https://techtweedie.github.io/posts/250210-send-emails-from-flow-without-a-service-account/","summary":"","tags":["Power Automate"],"title":"Send Emails from Flow without a Service Account"},{"categories":["Power Platform","Power Automate","Azure Functions","Community Event"],"contents":"📅 Date: 30th January 2025\n📍 Location: Wesleyan, Colmore Circus Queensway, Birmingham\n🕕 Time: 6:30 PM to 9:00 PM GMT\nSession Overview At the Norfolk Power Platform User Group, I delivered a session titled:\n✨ “Enabling low-code developers to create custom client reports with a helping hand from Azure”\nIn this session, I explored:\nUsing Power Automate to generate custom reports securely. 
How to leverage an Azure Function as a reverse proxy to enhance security and scalability. An architectural pattern that integrates Dataverse, Power Automate, SharePoint, and Azure Functions to deliver dynamic, client-ready PDFs via secure links. Presentation Slides You can view my full presentation here:\n➡️ View the Presentation Slides\nGitHub Repository The demo and solution shared in this session are available on GitHub:\n➡️ AzureFunction-PowerAutomateProxy Repository\nThis repository includes:\nAn Azure Function HTTP Proxy that authenticates requests and forwards them to your Power Automate Flow. Full support for GET, POST, and OPTIONS methods. Configurable environment variables to avoid hardcoded URLs or keys. Deploy-to-Azure templates for quick deployment into your own environment. Key Features of the Solution ✅ Forwards all headers from the incoming request.\n✅ Adds a custom Flow-Key header (from environment variables).\n✅ Appends query parameters to the external URL dynamically.\n✅ Provides a robust pattern for securely exposing Power Automate flows as web endpoints.\nTry It Out If you’re interested in setting up this pattern for your own client reporting requirements, follow these steps:\nAccess the GitHub repo. Deploy using the Deploy to Azure button provided in the README. Import the Power Platform solutions to your environment. Configure your environment variables with your Logic App or Flow URL and secret key. Test your new secure reverse proxy-enabled flow! Why This Matters Many organisations need to provide custom PDF or HTML reports to clients securely without exposing their backend systems. 
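The key features listed above (forward the caller's headers, inject a secret Flow-Key header, append the caller's query parameters to the flow URL) can be sketched minimally in Python. This is an illustration of the pattern, not the actual Azure Function code, and the parameter names are hypothetical; the real function reads the flow URL and key from environment variables:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def build_forwarded_request(incoming_headers, incoming_query, flow_url, flow_key):
    """Sketch of the proxy's forwarding step: copy the caller's headers,
    add the secret Flow-Key header server-side, and merge the caller's
    query parameters into the configured flow URL. Hypothetical names;
    not the deployed function's implementation."""
    headers = dict(incoming_headers)      # forward all incoming headers
    headers["Flow-Key"] = flow_key        # secret is never exposed to the caller
    scheme, netloc, path, query, fragment = urlsplit(flow_url)
    merged = parse_qsl(query) + list(incoming_query.items())
    return urlunsplit((scheme, netloc, path, urlencode(merged), fragment)), headers

url, headers = build_forwarded_request(
    {"Accept": "application/json"},
    {"reportId": "42"},
    "https://example.com/flow?api-version=1",  # placeholder flow endpoint
    "s3cret",
)
print(url)  # https://example.com/flow?api-version=1&reportId=42
```

Because the key is added server-side, clients can be given a plain link while the flow endpoint itself stays private.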
Combining Power Automate for report generation with Azure Functions for authentication and reverse proxying provides a scalable, secure, and low-code approach to achieve this.\nEvent Registration Link Here is the updated event registration link:\n➡️ Event Registration – Birmingham Power Platform User Group\n💡 Want to know more?\nReach out if you’re interested in deploying this pattern in your organisation or have questions about Azure Functions and Power Automate integration.\nThank you to everyone who attended the session. It was great to see so much interest in combining low-code solutions with pro-code architecture to deliver real business value.\n","date":"January 30, 2025","hero":"/events/250130-bppug/image.png","permalink":"https://techtweedie.github.io/events/250130-bppug/","summary":"","tags":["Power Platform","Power Automate","Azure Functions","Community Event"],"title":"At Birmingham Power Platform User Group: Enabling Low-Code Developers to Create Custom Client Reports with a Helping Hand from Azure – Norfolk Power Platform User Group"},{"categories":["Community"],"contents":"What Would Happen If You Lost Your Dev Environment? Ever wondered what would happen if someone accidentally deleted your development site or overwrote a key flow? Rebuilding from memory isn\u0026rsquo;t just frustrating — it’s costly.\nThis post walks through the process I demonstrated during my session with the Power Platform Learner to Leader community, showing how to automatically back up your environment using Azure DevOps, without requiring advanced YAML knowledge.\n📺 Watch the full session here\nWhy Back Up? We’ve all seen it — shared dev environments, unmanaged solutions sitting idle, a delete button clicked in the wrong place, and poof — it’s gone.\nSetting up a DevOps pipeline gives you version control, rollback, and documentation without relying on memory or screenshots.\nWhat You’ll Need A Power Platform environment (Dataverse) Azure DevOps (free tier is enough!) 
App Registration with permissions to Dataverse Five free DevOps licenses included per tenant One-time parallelism request (12-hour approval) What the Pipeline Does Here\u0026rsquo;s what we built together, step-by-step:\n✅ Install Power Platform Build Tools 🔐 Create a service connection with your app registration 📦 Export your solution (managed \u0026amp; unmanaged) 🔍 Unpack the solution into a Git repository 📖 Optionally: Export reference data and download Power Pages site 📄 Use my open-source DevOps extension to: Document tables Visualise relationships with Mermaid.js Automatically commit changes to a Git repo ⏰ Schedule it to run multiple times per day This gives you historical snapshots — no more guesswork on \u0026ldquo;what changed and when.\u0026rdquo;\nWhat If You’re Using Dataverse for Teams? I’ve built a special DevOps step that uses device login for backing up Dataverse for Teams environments too. It can’t run on a schedule (due to authentication), but it’s still a lifesaver when exporting and documenting your Teams-based solutions.\nIntroducing: Mightora DevOps Extension This free tool helps you:\n📘 Auto-generate table documentation 🔗 Build relationship diagrams 🧠 Document Canvas Apps 🔄 Push backups to your Git repo 📅 Run backups on a schedule It’s open-source and available via the Visual Studio Marketplace.\nTry It Yourself I’ve published the full presentation to GitHub with links and steps you can follow. Scan the QR code from the session or head over to my GitHub repo to get started.\nNo more “what did I change last week?”\nNo more “I wish I had a backup.”\nStart today — your future self will thank you.\nQuestions or Need a Hand? I love seeing this used in real projects. 
If you\u0026rsquo;re stuck or want to suggest improvements, ping me on LinkedIn or drop a comment on the YouTube video.\nhttps://lnkd.in/gJJs77rD\n","date":"December 14, 2024","hero":"/posts/241214-backing-up-your-development-environment-avoiding-costly-mistakes-in-low-code-development/thumbnail-2.png","permalink":"https://techtweedie.github.io/posts/241214-backing-up-your-development-environment-avoiding-costly-mistakes-in-low-code-development/","summary":"","tags":["Power Platform","DevOps","Backup","Azure DevOps"],"title":"At Learner to Leader: Avoid Costly Mistakes: Backing Up Your Power Platform Development Environment"},{"categories":["Power Platform","Power Automate","Azure Functions","Community Event"],"contents":"📅 Date: 9th November 2024\n📍 Location: Online\nSession Overview At the Low Code No Code Microsoft Power Platform Conference 2024, I delivered a session titled:\n✨ \u0026ldquo;Enabling low-code developers to create custom client reports with a helping hand from Azure\u0026rdquo;\nMany of us have found a way to deliver custom reports from Dataverse, or other outputs to our clients via low-code tools such as Power Automate. 
Maybe we have used the Word connector, Dataverse, a bit of SharePoint, and a sprinkle of Outlook to deliver it.\nMaybe we have created a mail merge type flow, but what and when do you trigger it?\nEver wondered if you could get the client to trigger it via a web link?\nEver wondered if you could get the client to trigger it via a web link in the context of themselves?\nEver wondered if you could get the client to trigger it via a web link in the context of themselves, with Single Sign-On?\nIn this presentation, I showed how, with a bit of Azure, you can enable low-code developers to create custom client reports, or any other output you can think of, using Power Automate, whether it be for membership certificates, invoices, or anything else.\nThis presentation is a co-pilot free zone 🙂\nLive Link ➡️ Watch the Presentation\n","date":"November 9, 2024","hero":"/events/241109-lcnc/image.png","permalink":"https://techtweedie.github.io/events/241109-lcnc/","summary":"","tags":["Power Platform","Power Automate","Azure Functions","Community Event"],"title":"Enabling Low-Code Developers to Create Custom Client Reports with a Helping Hand from Azure – Low Code No Code Conference"},{"categories":["Power Platform"],"contents":"Do you need to move a Solution in Dataverse for Teams from one environment to another? Are you unable to use a Pipeline to move a solution around? Want easy, repeatable steps? Need to be able to drill into what changes have taken place between solution builds?\nToday I will show you a tool that I have made to make this process easier by using a PowerShell Script to move Dataverse for Teams solutions between environments.\nWhat you don\u0026rsquo;t need Azure DevOps - however, if you have Azure DevOps I have a pipeline tool currently in Preview for you on the Visual Studio Marketplace; you can find it here! 
What you need Things you need to follow along with this demo; however, as you will see, you don\u0026rsquo;t need them all to use the tool.\nGit - This is a small program to help with source control. GitHub Account - If you want to follow the steps exactly; however, you can also just download a copy of the repository. Visual Studio Code - A free tool, or you can use another IDE of your preference. PowerShell and the ability to run Set-ExecutionPolicy -ExecutionPolicy Unrestricted -Scope CurrentUser Any other tools are covered in the tutorial.\nSteps to use the tool Step 1 - Copy the script In this example we are going to use this repo: Power-Platform-Dataverse4Teams-Tools as a template so we can get up and running quickly.\nWhen we use this template we are prompted for the location and name for your version of the repository. I have made mine public here (itweedie/Power-Platform-Dataverse4Teams-Demo on github.com), but you may wish to keep yours private.\nOnce you have created your version of the repository from the template, copy a link to it using the button just above the file list.\nHowever, you could also add it to an existing repository as a submodule using this command.\ngit submodule add https://github.com/mightora/Power-Platform-Dataverse4Teams-Tools.git scripts/dataverse4Teams After doing this you will need to initialize and update the submodule by running this command:\ngit submodule update --init --recursive admonition Please also note that you will have to update your script paths in the examples we give below. 
admonition\nStep 2 - Clone the repository into Visual Studio Code For our example we are going to clone the repository into Visual Studio Code, using the link from Step 1.\nTo do this, at the top of your Visual Studio Code window enter \u0026gt; Clone and Git: Clone should appear.\nStep 3 - Get set up You will then be prompted to install the extensions that are useful when using this repository.\nadmonition If you are not, then navigate to File \u0026gt; Preferences \u0026gt; Settings and search for Extension Ignore Recommendations or use this link and make sure it is not ticked. admonition\nStep 4 - Get the Environment ID for the source We are going to need our Environment ID to use this script. To get this, navigate to Environments | Power Platform admin center (microsoft.com) and locate your source environment.\nGo into it and copy the Environment ID.\nStep 5 - Get the Environment ID for the target We now need to do the same for our target environment.\nStep 6 - Check the source in the maker portal You can do this via https://make.powerapps.com/environments/{ENVIRONMENT ID HERE}/home\nFor ease I am going to keep these in a file called note.md\nStep 7 - Check the target in the maker portal You can do this via https://make.powerapps.com/environments/{ENVIRONMENT ID HERE}/home Step 8 - Overview of what we are going to do A quick overview of what we are going to be doing.\ngraph LR; A[Start] --\u003e B[Export Solution] B --\u003e C[Unpack Solution] C --\u003e D[Review] D --\u003e E[Repack Solution] E --\u003e F[Import Solution] F --\u003e G[Publish Customizations] G --\u003e H[End] Step 9 - Let’s run the script This script exports a solution from a source environment, generates a solution settings template, and unpacks the solution, including any Canvas Apps it contains.\nParameters:\n-solutionName: The name of the solution to export. -exportDirectory: Directory where the solution\u0026rsquo;s zip file will be exported. 
-sourceEnv: ID of the source environment from which to export the solution. -unpackDirectory: Directory where the solution will be unpacked and Canvas Apps will be processed. Pulling all of this together, we are going to run the command:\n.\\pipelineScripts\\downloadFromSource.ps1 -solutionName \u0026#34;Dataverse4TeamsDemo\u0026#34; -exportDirectory \u0026#34;.\\demo\\dataverse4TeamsDemo\u0026#34; -sourceEnv \u0026#34;1838fca4-6258-e6b8-a710-60838df81aa3\u0026#34; -unpackDirectory \u0026#34;.\\demo\\dataverse4TeamsDemo\\unpacked\u0026#34; Please remember to replace the variable values with your own; for more information, please consult the repository documentation.\nStep 10 - Explore our unpacked solution We can now explore our unpacked solution. This is where using Git becomes very useful, as you can see from the commits what the changes were on the solution between runs.\nStep 11 - Push the solution to the target environment Let’s push our solution to the target environment.\nThis script re-packs a previously unpacked solution and imports it into a target environment. If an environment settings file is provided, the import will include those settings.\nParameters:\n-solutionName: Name of the solution to be processed. -unpackDirectory: Directory where the solution is unpacked. -environmentSettingsFile: (Optional) Path to the environment settings file. -targetEnvironment: Target environment to which the solution will be imported. -exportDirectory: Directory where the repacked solution will be exported. -Managed: Switch true or false to indicate whether the solution should be managed. Step 12 - Let’s see if it worked Let’s explore using the maker portal to see if it has worked.\nStep 13 - Let’s push our code Finally, let’s push our code to our repository on GitHub so next time we run it we can see the changes between runs. 
When you come to do this you may need to set your user in Git.\nYou can set your user in Git by running the below commands with your information.\ngit config --global user.email \u0026#34;you@example.com\u0026#34; git config --global user.name \u0026#34;Your Name\u0026#34; Conclusion By following these steps, you can establish a straightforward and repeatable process for transferring Dataverse for Teams solutions between environments without relying on Azure DevOps. The PowerShell scripts simplify the task of exporting and reimporting Dataverse for Teams solutions. Integration with Git streamlines tracking changes, enabling you to monitor differences between solution builds over time.\n","date":"October 22, 2024","hero":"/posts/effortlessly-move-dataverse-for-teams-solutions/featureImage.png","permalink":"https://techtweedie.github.io/posts/effortlessly-move-dataverse-for-teams-solutions/","summary":"","tags":["PowerShell","Pipeline"],"title":"Effortlessly Move Dataverse for Teams Solutions"},{"categories":["Power Platform","DevOps","Backup","Azure DevOps"],"contents":"What Would Happen If You Lost Your Dev Environment? Ever wondered what would happen if someone accidentally deleted your development site or overwrote a key flow? Rebuilding from memory isn\u0026rsquo;t just frustrating — it’s costly.\nThis post walks through the process I demonstrated during my session with the Power Platform Learner to Leader community, showing how to automatically back up your environment using Azure DevOps, without requiring advanced YAML knowledge.\n📅 Date: 3rd October 2024\n📍 Location: Virtual Power Platform User Group\n🕕 Time: 6:00 PM to 8:00 PM BST\n📺 Watch the full session here\nWhy Back Up? 
We’ve all seen it — shared dev environments, unmanaged solutions sitting idle, a delete button clicked in the wrong place, and poof — it’s gone.\nSetting up a DevOps pipeline gives you version control, rollback, and documentation without relying on memory or screenshots.\nWhat You’ll Need A Power Platform environment (Dataverse) Azure DevOps (free tier is enough!) App Registration with permissions to Dataverse Five free DevOps licenses included per tenant One-time parallelism request (12-hour approval) What the Pipeline Does Here\u0026rsquo;s what we built together, step-by-step:\n✅ Install Power Platform Build Tools 🔐 Create a service connection with your app registration 📦 Export your solution (managed \u0026amp; unmanaged) 🔍 Unpack the solution into a Git repository 📖 Optionally: Export reference data and download Power Pages site 📄 Use my open-source DevOps extension to: Document tables Visualise relationships with Mermaid.js Automatically commit changes to a Git repo ⏰ Schedule it to run multiple times per day This gives you historical snapshots — no more guesswork on \u0026ldquo;what changed and when.\u0026rdquo;\nWhat If You’re Using Dataverse for Teams? I’ve built a special DevOps step that uses device login for backing up Dataverse for Teams environments too. It can’t run on a schedule (due to authentication), but it’s still a lifesaver when exporting and documenting your Teams-based solutions.\nIntroducing: Mightora DevOps Extension This free tool helps you:\n📘 Auto-generate table documentation 🔗 Build relationship diagrams 🧠 Document Canvas Apps 🔄 Push backups to your Git repo 📅 Run backups on a schedule It’s open-source and available via the Visual Studio Marketplace.\nTry It Yourself I’ve published the full presentation to GitHub with links and steps you can follow. 
Scan the QR code from the session or head over to my GitHub repo to get started.\nNo more “what did I change last week?”\nNo more “I wish I had a backup.”\nStart today — your future self will thank you.\nQuestions or Need a Hand? I love seeing this used in real projects. If you\u0026rsquo;re stuck or want to suggest improvements, ping me on LinkedIn or drop a comment on the YouTube video.\nEvent Registration Link Here is the registration link for the Virtual Power Platform User Group event:\n➡️ Event Registration – Virtual Power Platform User Group\n","date":"October 3, 2024","hero":"/events/241003-vd365ppug/image-1.png","permalink":"https://techtweedie.github.io/events/241003-vd365ppug/","summary":"","tags":["Power Platform","DevOps","Backup","Azure DevOps"],"title":"At the Virtual Power Platform User Group Presenting - Avoid Costly Mistakes: Backing Up Your Power Platform Development Environment"},{"categories":["Azure"],"contents":"Introduction Want to calculate working days in Power Automate, but having trouble with the number of variations? That’s why I’ve put together a nifty little tool to help out the community and make this process a whole lot easier. In this post, I\u0026rsquo;ll guide you step-by-step through setting up and using this tool. Whether you’re trying to figure out the next working day, handle complex scheduling, or just want a smoother way to automate your date calculations, this is for you. Let’s dive in.\nYou can watch the accompanying YouTube video here About the connector The Calculate Working Day connector is free to use for everyone, has been through the Certified Connector process, and has been created as a contribution to the Power Platform Community.\nFind out more Microsoft Learn documentation here You can access more detailed documentation on the connector here. admonition Quick Start:\nAdd the step into your Power Automate Flow. If asked for an API key please enter free, and if asked for an endpoint select RapidAPI. Pass in your date values. 
admonition Steps to use the connector Below are a number of steps and examples covering how to use the Calculate Working Day connector.\nStep 1 - Create a Power Automate Flow We are going to create a Power Automate Flow with a manual trigger. We are also going to add a date field to that trigger. Step 2 - Let’s add the connector Then we are going to search for the connector Calculate Working Day in the connector list. Step 3 - Enter the API key We are going to enter the API key; currently you can just enter free for the key. However, we may need to change this in the future to put some throttling in place, but it will always be FREE. Step 4 - Configure Basic Next Working Day Let’s configure our first connector, Basic Next Working Day; this connector will tell you when the next working day is, assuming working days are Monday to Friday. Step 5 - Test it Let’s give that a test to see if it works. Step 6 - Date In X Working Days Let’s complicate matters slightly and calculate the working day in 4 working days’ time, assuming that Monday, Tuesday, Wednesday, Thursday and Friday are all working days. This is symbolised by 1,2,3,4,5 in the Working Days input box. Step 7 - Test Date In X Working Days Now let’s give that a test to see if it works. Step 8 - Let’s complicate it a little Let’s say we want to know the working day in 4 working days’ time, but only Monday and Wednesday are working days.\nWe configure this by entering 1,3 into the Working Days input box.\nAs we can see, it calculates the working day to be 7th October. Step 9 - Let’s use a Combination of all Calculated Working Day Endpoints Using the action Combination of all Calculated Working Day Endpoints, let’s set it up again, but this time using the Trigger Date field as the Input Date on the connector.\nWe also want to set it so that Working Days are only a Monday and Wednesday.\nWe also want to add in a filter for Scottish bank holidays. 
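To make the Working Days input concrete before we test: it takes weekday numbers where 1 is Monday through 7 is Sunday, and the connector behaves as if it walks forward from the input date, skipping excluded weekdays. A rough Python sketch of that logic, purely illustrative and not the connector's actual implementation:

```python
from datetime import date, timedelta

def next_working_day(start, working_weekdays, non_working_days=()):
    """Illustrative next-working-day logic: weekday numbers use 1=Monday
    through 7=Sunday, matching the Working Days input (e.g. "1,3" means
    Monday and Wednesday). Not the connector's actual implementation."""
    day = start
    while True:
        day += timedelta(days=1)
        if day.isoweekday() in working_weekdays and day not in non_working_days:
            return day

# Friday 20 Dec 2024 with Mon-Fri working days: next working day is Mon 23 Dec.
print(next_working_day(date(2024, 12, 20), {1, 2, 3, 4, 5}))  # 2024-12-23
```

The optional non_working_days argument mirrors the connector's Non working days input, which we will use in a later step.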
Step 10 - Let’s test the Combination action OK, let’s give that a test; we can see it has returned some useful data for us. The connector has told us, based on what we input, that:\nThe Next working day is 2024-12-30 The Working day in x days is 2025-01-13 Because this is a combination of all our endpoints it has given us some other information as well:\nThe First Working day of month is 2024-12-04 The Last Working day of month is 2024-12-30 Step 11 - Let’s add in some non-working days Let’s take this one step further and add in some non-working days as 2024-12-23,2024-12-24,2024-12-27,2024-12-30,2024-12-31, as maybe your business is closed over this period. We are also going to change our Input Date to be 2024-12-20.\nWith this new information, what has the connector told us?\nIs input date a working day No Next working day 2025-01-06 Working day in x days 2025-01-15 First Working day of month 2024-12-04 Last working day of month 2024-12-18 Currently this endpoint only supports UK bank holidays; there is an update coming which will allow you to select none for bank holidays.\nHowever, we can enter a blank custom value to not filter out any bank holidays.\nStep 12 - Combined with no bank holidays Changing the settings of our connector to work with no bank holidays, let’s use the same settings as above but this time change the Filter bank holidays for Country to no value.\nWe are also going to keep our Input Date as 2024-12-20.\nWith this new information, what has the connector told us?\nIs input date a working day No Next working day 2024-12-25 Working day in x days 2025-01-08 First Working day of month 2024-12-02 Last working day of month 2024-12-25 Step 13 - Is today a working day Let’s take a look at another connector, Is Today A Working Day; this one tells you if today is a working day or not based on the Working Days entered.\nTrying this connector out, we enter that our only working day is a Friday by using 5.\nAs you can see, we are testing this on a Monday and therefore it tells us that today is not one of the working days. Step 14 - First and last working days of a month We also have an action called First And Last Working Day Of Month; this action will tell you the first and the last working days of the month.\nIn this example we have entered in:\nWorking Days as being Monday, Tuesday, Wednesday by entering 1,2,3 In the response we can see that based on the date provided it has calculated the first and last working day of the month to be:\nFirst Working day as 2024-09-02 Last working day as 2024-09-30 Conclusion With the various connectors outlined in this guide, you now have a powerful toolkit to handle a wide range of working day calculations within Power Automate. Whether you need to find the next working day, calculate a future date, or exclude specific holidays and non-working days, these connectors make the process seamless and adaptable to your specific needs.\nThis tool is designed to be flexible and user-friendly, allowing you to quickly and accurately perform complex date calculations with ease. And while this guide focused on basic scenarios, the connectors are capable of handling even more intricate requirements as you explore their full potential.\nAs always, I\u0026rsquo;m keen to hear your feedback and see how you implement these solutions. If you have any questions or suggestions, feel free to reach out—let\u0026rsquo;s continue to make Power Automate workflows even more powerful together!\n","date":"September 24, 2024","hero":"/posts/240924-calculate-working-day/featureImage.png","permalink":"https://techtweedie.github.io/posts/240924-calculate-working-day/","summary":"","tags":["Power Automate"],"title":"Calculate Working Day"},{"categories":["How to"],"contents":"Introduction Ever wondered what would happen if you lost your development environment? How do you feel about having to do all of that work again? What about if you spot a problem days later? 
Can you remember the value you changed three days ago in that flow? Have you ever seen someone delete the website record by mistake?\nThese are all situations I have been in over the last few years, and having a Pipeline in place to ensure backup of your hard work is extremely important. That historical record can save you many hours.\nToday I am going to take you through how to create a DevOps Pipeline to back up your environment.\nYou can watch the accompanying YouTube video here Prerequisites for Setting Up a Pipeline in Azure DevOps to Export a Solution from Dataverse Azure DevOps Organization:\nEnsure you have an Azure DevOps organization. If not, you can create one for free here. Azure Repos:\nA repository where your pipeline code will reside. Dataverse Environment:\nEnsure you have access to the Dataverse environment from which you want to export the solution. Service Principal:\nCreate a service principal in Azure Active Directory (AAD) with the necessary permissions to access Dataverse. Power Platform Build Tools:\nInstall the Power Platform Build Tools extension in Azure DevOps. This extension provides tasks to automate common build and deployment activities related to Power Platform. Create our backup pipeline Step 1: Navigate to the Pipelines Screen From within our DevOps Project screen, we need to navigate to Pipelines \u0026gt; Pipelines Step 2: Create first Pipeline Then we are going to create our first pipeline. To do this click on Create Pipeline, then choose Azure Repos, and then choose Starter pipeline. Once we have done that we will then run our pipeline. Step 3: Pipeline results After running our pipeline for the first time we can see we have an issue: for our DevOps organisation, in this case TechTweedie, we don\u0026rsquo;t have parallelism. 
If you don\u0026rsquo;t see this error and your pipeline runs, then please skip to step X Step 4: Request Parallelism To enable this we need to fill out this form https://aka.ms/azpipelines-parallelism-request Step 5: After a little while, approval comes through An email arrives outlining that parallelism in Azure DevOps has been enabled. Step 6: Let’s try again Trying again, we can now see our pipeline ran successfully. We now have a working Pipeline. Step 7: Let’s edit our pipeline We now need to edit our pipeline and start adding the actions to export our components from Dataverse. Step 8: Set the name and variables Next we are going to set the name and some variables to the below:\nname: $(TeamProject)_$(BuildDefinitionName)_$(SourceBranchName)_$(Date:yyyyMMdd)$(Rev:.r) variables: - name: varPowerPlatformSPN value: \u0026lt;YOUR SERVICE CONNECTION\u0026gt; - name: varSolutionName value: \u0026lt;YOUR SOLUTION NAME\u0026gt; - name: varWebsiteId value: \u0026lt;YOUR WEBSITE ID IN HERE\u0026gt; Step 9: Set a schedule for our pipeline Then we are going to set a schedule for our pipeline to trigger on. 
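The cron expression we are about to use, 0 0,13,18 * * 1-5, can be read field by field. As I understand the Azure DevOps scheduled-trigger syntax, the five fields are minute, hour, day of month, month, and day of week (with days of week numbered 0-6 starting on Sunday, so 1-5 is Monday to Friday). A small Python sketch decoding it:

```python
# Decoding the schedule's cron expression field by field.
# Azure DevOps cron fields: minute, hour, day of month, month, day of week
# (day of week runs 0-6 starting with Sunday, so 1-5 is Monday to Friday).
expr = "0 0,13,18 * * 1-5"
minute, hour, day_of_month, month, day_of_week = expr.split()

hours = [int(h) for h in hour.split(",")]  # run at 00:00, 13:00 and 18:00
print(minute, hours, day_of_week)          # 0 [0, 13, 18] 1-5
```

Adjust the hour list and day-of-week range to suit how often you want snapshots taken.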
In this example it will trigger at midnight, 1pm and 6pm, Monday to Friday every week.\ntrigger: none schedules: - cron: 0 0,13,18 * * 1-5 displayName: Weekday Backup branches: include: - main always: true Step 10: Update our variables Next we need to update our variables with values that link the pipeline to our environment.\nIn this example I am using the below values; make sure you change these for your pipeline.\nvariables: - name: varPowerPlatformSPN value: Dataverse - Backup - name: varSolutionName value: ModelDrivenAppHelpPage - name: varWebsiteId value: 98a716a8-c592-476c-bc5d-aefde10a8b5d Step 11: Change VM pool Next we are going to change the VM pool image over to vmImage: 'windows-latest' Step 12: Check out our Repository We want to check out the repository that we want to update with the latest solution export from Power Platform.\n- checkout: self persistCredentials: true clean: true Step 13: Add build tools Delete what is already there and add in the Power Platform build tool step. Step 14: Add Who Am I and then Test Add in the Who Am I step so we can test the connection, and then don\u0026rsquo;t forget to change over the variable to make it easier to re-use the pipeline on future projects. Step 15: Let’s test it Let’s test our work so far. Step 16: Set version number Now that it\u0026rsquo;s working we are going to set the solution version; we can do this by adding in a task called Set Solution Version. Once we add this task we are going to update it with our variables for the Power Platform SPN and Solution Name. Then we are going to set the Solution Version Number to be '1.0.0.$(Build.BuildID)'. This pulls in the Build ID, ensuring that the solution version number is always unique. 
Step 17: Export the solution as Managed Then we are going to export the solution; once again we are going to set the Power Platform SPN and Solution Name, and we are also going to set the Solution Output File to $(Build.ArtifactStagingDirectory)\\$(varSolutionName)_managed.zip Step 18: Export the solution as Un-Managed Next we are going to copy the YAML from our last step and then change it slightly to get our unmanaged solution. Step 19: Unpack solution Then unpack the solution; this allows us to see what\u0026rsquo;s changed inside between runs. For this step we will set Solution Input File to '$(Build.ArtifactStagingDirectory)\\$(varSolutionName).zip', and the Solution Target Folder to '$(Build.SourcesDirectory)\\src\\solutions\\$(varSolutionName)' Step 20: Commit it We are going to commit these changes to our repository.\necho commit all changes git config user.email \u0026#34;$(Build.RequestedForEmail)\u0026#34; git config user.name \u0026#34;$(Build.RequestedFor)\u0026#34; git checkout -b main git add --all git commit -m \u0026#34;Latest solution changes.\u0026#34; echo push code to new repo git -c http.extraheader=\u0026#34;AUTHORIZATION: bearer $(System.AccessToken)\u0026#34; push origin main Step 21: Let’s check if it worked Next let’s check if it worked. Oh no, let’s see if we can fix it. Let’s test it again. 
Step 22: Take a look at the Repo Looking at the repo Pipeline Script in Full Here is the YAML pipeline script in full\nname: $(TeamProject)_$(BuildDefinitionName)_$(SourceBranchName)_$(Date:yyyyMMdd)$(Rev:.r) variables: - name: varPowerPlatformSPN value: Dataverse - Backup - name: varSolutionName value: ModelDrivenAppHelpPage - name: varWebsiteId value: 98a716a8-c592-476c-bc5d-aefde10a8b5d trigger: none schedules: - cron: 0 0,13,18 * * 1-5 displayName: Weekday Backup branches: include: - main always: true pool: vmImage: \u0026#39;windows-latest\u0026#39; steps: - checkout: self persistCredentials: true clean: true - task: PowerPlatformToolInstaller@2 inputs: DefaultVersion: true AddToolsToPath: true - task: PowerPlatformWhoAmi@2 inputs: authenticationType: \u0026#39;PowerPlatformSPN\u0026#39; PowerPlatformSPN: \u0026#39;$(varPowerPlatformSPN)\u0026#39; - task: PowerPlatformSetSolutionVersion@2 inputs: authenticationType: \u0026#39;PowerPlatformSPN\u0026#39; PowerPlatformSPN: \u0026#39;$(varPowerPlatformSPN)\u0026#39; SolutionName: \u0026#39;$(varSolutionName)\u0026#39; SolutionVersionNumber: \u0026#39;1.0.0.$(Build.BuildID)\u0026#39; - task: PowerPlatformExportSolution@2 inputs: authenticationType: \u0026#39;PowerPlatformSPN\u0026#39; PowerPlatformSPN: \u0026#39;$(varPowerPlatformSPN)\u0026#39; SolutionName: \u0026#39;$(varSolutionName)\u0026#39; SolutionOutputFile: \u0026#39;$(Build.ArtifactStagingDirectory)\\$(varSolutionName)_managed.zip\u0026#39; Managed: true AsyncOperation: true MaxAsyncWaitTime: \u0026#39;60\u0026#39; - task: PowerPlatformExportSolution@2 inputs: authenticationType: \u0026#39;PowerPlatformSPN\u0026#39; PowerPlatformSPN: \u0026#39;$(varPowerPlatformSPN)\u0026#39; SolutionName: \u0026#39;$(varSolutionName)\u0026#39; SolutionOutputFile: \u0026#39;$(Build.ArtifactStagingDirectory)\\$(varSolutionName).zip\u0026#39; Managed: false AsyncOperation: true MaxAsyncWaitTime: \u0026#39;60\u0026#39; - task: PowerPlatformUnpackSolution@2 inputs: 
SolutionInputFile: \u0026#39;$(Build.ArtifactStagingDirectory)\\$(varSolutionName).zip\u0026#39; SolutionTargetFolder: \u0026#39;$(Build.SourcesDirectory)\\src\\solutions\\$(varSolutionName)\u0026#39; SolutionType: \u0026#39;Both\u0026#39; - task: CmdLine@2 inputs: script: | echo commit all changes git config user.email \u0026#34;$(Build.RequestedForEmail)\u0026#34; git config user.name \u0026#34;$(Build.RequestedFor)\u0026#34; git checkout -b main git add --all git commit -m \u0026#34;Latest solution changes.\u0026#34; echo push code to new repo git -c http.extraheader=\u0026#34;AUTHORIZATION: bearer $(System.AccessToken)\u0026#34; push origin main Conclusion \u0026amp; Next vlog Implementing a DevOps pipeline for backing up your Dataverse development environment is an essential practice that safeguards your work against unexpected issues and errors. By automating this process, you ensure that every change is recorded, making it easier to track modifications and recover from mistakes quickly. This pipeline provides a robust safety net, helping you avoid the frustration of lost progress and ensuring that your development efforts are always protected.\nIn the next post, I will delve deeper into this pipeline by adding advanced features, such as integrating Power Pages Portal, exploring and tracking changes in XML files, utilizing environment variables more effectively, and automatically generating wiki articles based on your solution components. These enhancements will further streamline your DevOps process and provide even greater control and visibility over your development environment.\nIf you have any questions or need more details on any of the prerequisite steps mentioned, feel free to leave a comment on the YouTube video. 
I am happy to create additional content to address those topics and help you get the most out of your DevOps setup.\n","date":"September 3, 2024","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/posts/backup-dataverse-development-environment/","summary":"","tags":["Power Platform DevOps","DevOps","Backup","Power Platform"],"title":"Backup Dataverse Development Environment"},{"categories":["Power Platform"],"contents":"Introduction Power Automate is a wonderful tool, but there is a world of difference between building something for personal productivity and utilising this amazing low-code tool for production, where an important business application will rely on it.\nWhen considering a workload like this, we need to make it easier to manage and troubleshoot, ensure reliability, and prioritise security. Finally, if the worst were to happen, such as some credentials becoming leaked or an endpoint being compromised, how can we configure this so that damage is limited?\nWhat can we do There is no perfect cookie-cutter answer that fits all, and it would be impossible to conceive of every situation that will occur. It’s also important to exercise judgment when drafting the guidelines for your project and ensure that each one has a purpose and reduces perceived risk.\nSuggestions These are some of my suggested guidelines, based on my previous experience;\nPower Automate Flows Ownership Consider utilising a service account to own and manage Power Automate flows; this service account should have no permissions to the data workloads and should purely be there to manage and own the Power Automate flows.\nIn development environments In development environments, create connections and link them to connection references using a service account, not from a random developer’s single-user account. 
Then, share them with the rest of the development team.\nThe benefit of doing this is that your developers do not need to handle the credentials, reducing the risk of leaks. Additionally, your development environment remains clean, and it prevents every developer from having their own set of connection references to the same place. When it comes to deploying through a pipeline, this also speeds up the process of mapping connection references in your settings deployment file.\nUse solutions Although connections themselves are not solution aware, connection references are, and should always be created inside of a solution.\nUse Application Users for Dataverse When interacting with Dataverse from a Power Automate flow, utilise application registrations and application users. You can then permission these application users appropriately for the actions they should be performing, utilising the principle of least privilege.\nUse Application Users in Dataverse to help with Troubleshooting It can be really tough to figure out which automation caused a value to change, especially when all the Power Automate flows are running under the same user account. It\u0026rsquo;s even harder if that user is one of your everyday users.\nTherefore, consider creating different application users for various types of activities.\nThe usernames will then appear on the records within Dataverse under \u0026ldquo;created by\u0026rdquo;, \u0026ldquo;modified by\u0026rdquo;, or in the audit history if it\u0026rsquo;s switched on.\nUse App Registrations whenever possible Use application registrations instead of a service account username and password for a specific service when possible. This helps prevent access issues when the multi-factor authentication (MFA) needs to be renewed or when all refresh tokens are revoked due to policy changes. Additionally, it can save on licensing costs. 
This approach is especially helpful when accessing Entra, SharePoint, and Exchange.\nBalanced approach and Damage limitation We need to acknowledge the potential for a breach and prepare to minimise the damage. However, our response should be practical and take into account the level of risk. It\u0026rsquo;s important to strike a balance between creating numerous separate \u0026ldquo;users\u0026rdquo; (application registrations/service accounts, etc.) for every task, which would be difficult to manage, and relying on a single highly privileged account.\nDon’t - set up one service account, let it run all of your Power Automate flows, and give it permissions to all of the workloads that your flows are interacting with.\nDon’t - create a different application user/app registration for each connection on each flow unless you absolutely have to. It will become a management and maintenance nightmare.\nDo - have a balanced approach; split up the accounts that are interacting with the workload and the accounts that are used to manage it.\nFinal thoughts When considering your approach, think carefully about what the risk really is and what the impact would be, and find the balance. Some useful questions to ask are;\nAssume breach - How would you design the connection structure to limit damage if one were to be breached?\nLeast privilege - What is the minimum practical level of security you could grant to each connection?\nEasy to understand - How can you make it as easy as possible to understand for the next person? What could you do to help them troubleshoot future problems?\nContinual Improvement It\u0026rsquo;s important to keep in mind that the digital landscape, along with digital tools and security threats, is constantly changing. It\u0026rsquo;s crucial to stay updated on the latest best practices, updates, and potential vulnerabilities. Regularly review and improve your processes, and involve your team in discussions, as many brains are better than one. 
By fostering a culture of continuous improvement and adaptation, you can better safeguard your business applications and ensure that your Power Automate flows remain robust, secure, and efficient.\n","date":"August 5, 2024","hero":"/posts/securing-power-automate-for-production/featureImage.png","permalink":"https://techtweedie.github.io/posts/securing-power-automate-for-production/","summary":"","tags":["Power Automate","Power Platform Security"],"title":"Securing Power Automate for Production"},{"categories":["Power Platform"],"contents":"Introduction Recently, I was asked if it was possible to place a message at the top of every form and view across an entire Model-Driven App. This message would then open a Custom Page that would display more detail. In this blog post, I explain how this can be achieved and what steps need to be taken.\nFor the purposes of this demo, we will call a custom help page from a command button.\nSteps Step 1: Create a Solution Let\u0026rsquo;s create a new solution for this to sit within. Step 2: Custom Page From here we now need to build a custom page. Step 3: Our Model Driven App So we can demonstrate this, we need to create a Model Driven App. Step 4: Custom commands Let\u0026rsquo;s save and publish this Model Driven App, and then we can view where we are going to place our custom commands; for this demo they will call a Custom Help Page. Step 4: Custom page name Next we need to grab the name of our custom page, not its display name. In our demo this is techtwed_mdacustomhelppage_93a75. 
Step 5: JavaScript To create this we are going to use some JavaScript; here is some sample JavaScript we can adapt in order to create our button.\nfunction openCustomPage() { var pageInput = { pageType: \u0026#34;custom\u0026#34;, name: \u0026#34;new_custompage\u0026#34;, // The name of your custom page entityName: \u0026#34;account\u0026#34;, // The entity you want to open the custom page for, if applicable recordId: \u0026#34;00000000-0000-0000-0000-000000000000\u0026#34; // Optional, the ID of a specific record }; var navigationOptions = { target: 2, // Opens the page in a dialog width: { value: 80, unit: \u0026#34;%\u0026#34; }, height: { value: 80, unit: \u0026#34;%\u0026#34; }, position: 1, // Center title: \u0026#34;Custom Help Page\u0026#34; }; Xrm.Navigation.navigateTo(pageInput, navigationOptions).then( function success() { console.log(\u0026#34;Navigation successful\u0026#34;); }, function error(error) { console.log(\u0026#34;Navigation failed\u0026#34;); console.error(error); } ); } Let\u0026rsquo;s create our JavaScript in VSCode called openCustomHelpPage.js. After dropping it in we need to make some small changes to it. We need to remove\nentityName: \u0026#34;account\u0026#34;, // The entity you want to open the custom page for, if applicable recordId: \u0026#34;00000000-0000-0000-0000-000000000000\u0026#34; // Optional, the ID of a specific record as we don\u0026rsquo;t need it, update title with something appropriate, and update name with the value from step 4. Finally, we need to update the name of our function. Step 6: Create a Custom Command Next, we need to create our Custom Command. For this example, we will do it in the Accounts table. Navigate to your solution \u0026gt; Tables \u0026gt; Table Name \u0026gt; Commands.\nThen select Main grid and add a new button. Then, we need to upload our JavaScript. 
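After those edits, the uploaded openCustomHelpPage.js might look like the sketch below. The buildNavigationRequest helper is a hypothetical refactor added here so the inputs can be checked outside the app; Xrm only exists at runtime inside the model-driven app.

```javascript
// Builds the two objects Xrm.Navigation.navigateTo expects.
// pageName is the logical name from step 4, not the display name.
function buildNavigationRequest(pageName) {
  return {
    pageInput: {
      pageType: "custom",
      name: pageName
    },
    navigationOptions: {
      target: 2, // open the page in a dialog
      width: { value: 80, unit: "%" },
      height: { value: 80, unit: "%" },
      position: 1, // centered
      title: "Custom Help Page"
    }
  };
}

// The function the command bar button calls.
function openCustomHelpPage() {
  var req = buildNavigationRequest("techtwed_mdacustomhelppage_93a75");
  Xrm.Navigation.navigateTo(req.pageInput, req.navigationOptions).then(
    function success() { console.log("Navigation successful"); },
    function error(err) { console.log("Navigation failed"); console.error(err); }
  );
}
```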
Finally, let\u0026rsquo;s configure our button to call our JavaScript function openCustomHelpPage.\nLet\u0026rsquo;s see if it\u0026rsquo;s worked Let\u0026rsquo;s now add the same page to the form Let\u0026rsquo;s test to see if that has worked Conclusion By following these steps, you can add a message at the top of every form and view across your Model Driven App that opens a custom page. This approach enhances user experience by providing detailed help or information seamlessly integrated within your app.\nFor more tips and detailed guides, stay tuned to my blog. If you have any questions or need further assistance, feel free to reach out via my website.\n","date":"June 23, 2024","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/posts/command-bar-to-call-custom-page/","summary":"","tags":["Command Bar","Custom Page","Model Driven App","JavaScript"],"title":"Command Bar to Call Custom Page in Model Driven App"},{"categories":["Power Platform"],"contents":"Introduction Want to get started with the Power Apps Component Framework (PCF) and start building your own controls? In this blog post we are going to step through the process of building your first component.\nYou can watch the accompanying YouTube video here What languages To build out our component, we will be using both TypeScript and HTML. Don\u0026rsquo;t worry if you haven\u0026rsquo;t worked with these before.\nTools The tools we will be using today will be;\nVisual Studio Code Power Platform Tools Extension NodeJS LTS Why PCFs We use the PowerApps Component Framework to extend our apps, and to achieve functionality that is not available out of the box. Our low-code canvas apps can\u0026rsquo;t always achieve what our clients would like, and this is when we look towards this kind of functionality.\nOptions for building controls We have two frameworks we can utilise, alongside two different templates.\nFrameworks - Visual function Standard Controls - These don\u0026rsquo;t have React out of the box. 
React (Virtual) Controls - You get to take advantage of the React that is already being used in Dataverse, which reduces the size of your controls. Control Templates - Placement of control Dataset - These can be used instead of views and sub-grids Field - These are bound to a field and are limited to being used on a form Building our first PCF control Step 1: Authenticate We can authenticate against Dataverse with the following commands in PowerShell within Visual Studio Code.\npac auth create -env yourenv.crm11.dynamics.com pac auth list Step 2: Create a working folder Let\u0026rsquo;s make ourselves a working folder\nmkdir control11 cd control11 Step 3: Initialise a blank component Let\u0026rsquo;s just make sure we have Power Platform Tools installed, and let\u0026rsquo;s remind ourselves what parameters we need to pass.\nOpen up the Terminal and enter\npac pcf init help Back will come the following\nMicrosoft PowerPlatform CLI Version: 1.31.6+g9147a23 Online documentation: https://aka.ms/PowerPlatformCLI Feedback, Suggestions, Issues: https://github.com/microsoft/powerplatform-build-tools/discussions Help: Initializes a directory with a new Power Apps component framework project Commands: Usage: pac pcf init [--namespace] [--name] [--template] [--framework] [--outputDirectory] [--run-npm-install] --namespace The namespace for the component. (alias: -ns) --name The name for the component. (alias: -n) --template Choose a template for the component. (alias: -t) Values: field, dataset --framework The rendering framework for control. The default value is \u0026#39;none\u0026#39;, which means HTML. (alias: -fw) Values: none, react --outputDirectory Output directory (alias: -o) --run-npm-install Auto run \u0026#39;npm install\u0026#39; after the control is created. The default value is \u0026#39;false\u0026#39;. 
(alias: -npm) For our first PCF we are building something simple: a field template with no framework.\npac pcf init -n TechTweedieControl11 -ns TechTweedieControl11NS -t field Then run npm install, which is going to bring everything down\nnpm install You may get some warnings; don\u0026rsquo;t worry too much about them.\nStep 4: Let\u0026rsquo;s go through the code Now let\u0026rsquo;s explore some of our files.\nindex.ts file We find an index.ts file, inside which we will find;\ninit public init(context: ComponentFramework.Context\u0026lt;IInputs\u0026gt;, notifyOutputChanged: () =\u0026gt; void, state: ComponentFramework.Dictionary, container:HTMLDivElement): void { // Add control initialization code } This is the entry point to our component; it gets called in the lifecycle when our control is loaded.\nupdateView This is called whenever any value in the property bag has changed.\npublic updateView(context: ComponentFramework.Context\u0026lt;IInputs\u0026gt;): void { // Add code to update control view } getOutputs This is called to pass data back to Dataverse.\npublic getOutputs(): IOutputs { return {}; } destroy We can use this to remove any components from the DOM tree once we have finished with our component.\npublic destroy(): void { // Add code to cleanup control if necessary } ControlManifest.Input.xml file Tells us what the control is about, going through the elements.\nmanifest/control element This contains things like its display name and description.\nmanifest/property element This defines the properties the control exposes, such as the sample property we will bind to shortly.\nStep 5: Let\u0026rsquo;s run it We can run it to see what it looks like\nnpm start watch As this runs it will tell us if there are any errors, and it allows you to make changes on the VSCode side\nStep 6: Let\u0026rsquo;s add some console logs Inside our index.ts file let\u0026rsquo;s add two console.log(\u0026quot;text\u0026quot;); statements. Let\u0026rsquo;s see if we can access some data; let\u0026rsquo;s console log the context object.\nExploring the object, let\u0026rsquo;s see if we can access the value. 
Exploring the object we can see we have parameters \u0026gt; sampleProperty \u0026gt; raw, so let\u0026rsquo;s give that a go.\nLet\u0026rsquo;s change our console log to console.log(context.parameters.sampleProperty.raw);\nStep 7: Let\u0026rsquo;s create our input This is where we need to create a property\nprivate inputElement: HTMLInputElement; We are then going to update our init so that it works from this new property\nthis.inputElement = document.createElement(\u0026#34;input\u0026#34;) as HTMLInputElement; this.inputElement.setAttribute(\u0026#34;type\u0026#34;, \u0026#34;text\u0026#34;); this.inputElement.setAttribute(\u0026#34;value\u0026#34;, context.parameters.sampleProperty.raw || \u0026#34;\u0026#34;); // Get value of property, and in the case of a null value set it to an empty string container.appendChild(this.inputElement); Step 8: Get our values to update This looks good, but now if the value of our sample property is updated, the value does not update. In order to sort this, we need to update our input value when updateView runs. We need to enter this line under our updateView\nthis.inputElement.setAttribute(\u0026#34;value\u0026#34;, context.parameters.sampleProperty.raw || \u0026#34;\u0026#34;); However, does it update the other way? In order to make this work we need to modify our TypeScript.\nFirst of all we need to edit our init, and add an event listener,\n// We are adding an event listener for when the input changes, and if that happens we are going to notify this.inputElement.addEventListener(\u0026#34;input\u0026#34;, (event) =\u0026gt;{ notifyOutputChanged() }); Then we go to getOutputs and update the return to be\npublic getOutputs(): IOutputs { return { sampleProperty: this.inputElement.value }; } Step 9: Let\u0026rsquo;s get our control into our model-driven app With our control working, let\u0026rsquo;s deploy it to Dataverse and place it on a form.\nLet\u0026rsquo;s attach it to a control Conclusion Congratulations! 
You\u0026rsquo;ve successfully built and tested your first Power Apps Component Framework (PCF) control. This journey, from setting up the environment to deploying your control in a model-driven app, has provided you with a solid foundation for creating more complex and customized components. By leveraging TypeScript and HTML, along with powerful tools like Visual Studio Code and the Power Platform Tools Extension, you now have the skills to extend your applications beyond the standard functionalities.\nRemember, the true power of PCFs lies in their ability to enhance the user experience and meet specific business needs that out-of-the-box solutions may not cover.\nThank you for following along, and happy coding! If you have any questions or need further assistance, feel free to reach out or watch the accompanying YouTube video for more detailed explanations and visuals. Keep pushing the boundaries of what\u0026rsquo;s possible with Power Apps!\n","date":"June 16, 2024","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/posts/build-a-basic-pcf-field-control/","summary":"","tags":["PCF"],"title":"Build a Basic PCF Field Control"},{"categories":["Walkthrough"],"contents":"Introduction Hugo is an open-source static site generator, designed for speed and flexibility, making it an excellent choice when creating modern websites, blogs, and documentation. In this guide, we\u0026rsquo;ll walk you through the steps to install Hugo on your Windows 11 machine.\nYou can watch the accompanying YouTube video here Step 1: Download Hugo You can download Hugo using winget, Microsoft\u0026rsquo;s official free and open-source package manager for Windows. For this, let\u0026rsquo;s go to Visual Studio Code, open up PowerShell and run the below command.\nwinget install Hugo.Hugo.Extended Then restart your machine\nStep 2: Check that Hugo is available hugo version If it is not available, then you will need to add it to your environment variables. 
To make Hugo accessible from the command line, you need to add it to your system’s PATH.\nOpen the Start Menu and type \u0026ldquo;Environment Variables.\u0026rdquo; Click on \u0026ldquo;Edit the system environment variables.\u0026rdquo; In the System Properties window, click on the \u0026ldquo;Environment Variables\u0026hellip;\u0026rdquo; button. In the Environment Variables window, find the \u0026ldquo;Path\u0026rdquo; variable under \u0026ldquo;System variables\u0026rdquo; and click \u0026ldquo;Edit.\u0026rdquo; Click \u0026ldquo;New\u0026rdquo; and enter the path to the directory where Hugo was installed (e.g., C:\\path\\to\\hugo\\directory). Click \u0026ldquo;OK\u0026rdquo; to close all the windows. Step 3: Verify Installation To confirm that Hugo has been installed correctly, open Command Prompt and type:\nhugo version You should see a message displaying the version of Hugo that you installed.\nStep 4: Create Your First Site Now that Hugo is installed, you can create your first site.\na. Open Command Prompt. Open Command Prompt, navigate to the directory where you want to create your new site.\nb. Create your site Run the following command to create a new site:\nhugo new site mynewsite Then navigate into your new site directory:\ncd mynewsite c. Initialise Git repository Next we are going to initialise a Git repository; this is good practice and will be useful later.\ngit init d. Add a theme to your site. You can find many themes on the Hugo Themes website. For example, to add the PaperMod theme:\ngit submodule add --depth=1 https://github.com/adityatelange/hugo-PaperMod.git themes/PaperMod e. Then edit your config file Navigate to the hugo.toml file to add in your theme information by adding the following line\ntheme = \u0026#39;PaperMod\u0026#39; f. 
Add a post Finally create some new posts by running the following command\nhugo new content/posts/post_one.md hugo new content/posts/post_two.md Step 5: Update your content Inside your content folder, locate your posts, update them and set draft status to false.\nContent for Hugo Sample Post:\n# Welcome to Hugo! Hugo is a fast and flexible static site generator built with love by [bep](https://github.com/bep), [spf13](https://github.com/spf13), and [friends](https://github.com/gohugoio/hugo/graphs/contributors). Below is a demonstration of various features you can use in your Hugo site. ## Headings # Heading 1 ## Heading 2 ### Heading 3 #### Heading 4 ##### Heading 5 ###### Heading 6 ## Text Formatting **Bold text** *Italic text* ~~Strikethrough~~ \u0026gt; Blockquote Inline code: `var example = true` ## Links [Hugo\u0026#39;s official website](https://gohugo.io/) ## Lists ### Unordered List - Item 1 - Subitem 1 - Subitem 2 - Item 2 ### Ordered List 1. First item 2. Second item 1. Subitem 1 2. Subitem 2 ## Images ![Hugo Logo](https://gohugo.io/images/hugo-logo-wide.svg) ## Code Blocks ### JavaScript ```javascript console.log(\u0026#34;Hello, Hugo!\u0026#34;); ``` ### Python ```python def hello_hugo(): print(\u0026#34;Hello, Hugo!\u0026#34;) ``` ## Tables | Syntax | Description | |-----------|-------------| | Header | Title | | Paragraph | Text | ## Shortcodes ### Highlight // This is a Go code block package main import \u0026#34;fmt\u0026#34; func main() { fmt.Println(\u0026#34;Hello, Hugo!\u0026#34;) } ### Figure Hugo Logo Step 6: Build and Serve Your Site To see your site in action, you need to build and serve it.\nRun the following command to start the Hugo server: hugo server -D Open your web browser and go to http://localhost:1313. You should see your new Hugo site. Conclusion Congratulations! You\u0026rsquo;ve successfully installed Hugo on your Windows 11 machine and created your first static site. 
Hugo\u0026rsquo;s speed and flexibility make it an excellent choice for building a wide range of websites, from simple blogs to complex company sites.\nIf you have any questions or run into any issues, the Hugo community is very active and can be a great resource. Happy building!\n","date":"June 4, 2024","hero":"/posts/how-to-install-hugo-on-windows-11/featureImage.png","permalink":"https://techtweedie.github.io/posts/how-to-install-hugo-on-windows-11/","summary":"","tags":["Hugo"],"title":"How to Install Hugo on Windows 11"},{"categories":["Power Platform"],"contents":"Introduction Because I frequently move between desktop builds and want to be able to get up and running quickly, I’ve found it incredibly useful to have my own build script that covers 80% of the tools I am likely to need. Having essential tools ready to go saves a lot of time and hassle, which is why I created a PowerShell script to automate the installation of the developer tools I often use when working on the Power Platform.\nIn this blog post, I’ll share my PowerShell script, explain how it works, and show you how you can use it to streamline your own setup process.\nWhy Automate Your Setup? Manually installing and configuring development tools can be a tedious process, especially if you need to do it frequently. By automating this process, you can:\nSave time and effort Ensure consistency across different setups Quickly get back to productive work Introducing the Power Platform Developer Tools Install Script The script I developed leverages winget to install a variety of essential software packages. It also handles the setup of PowerShell modules, configuration of Git, installation of VSCode extensions, and more. Here’s a rundown of what the script does:\nInstalls development tools such as Visual Studio Code, Power BI Desktop, SQL Server Management Studio, Windows Terminal, Notepad++, PowerToys, Postman, Visual Studio 2022 Professional, and Node Package Manager (NPM). 
Installs PowerShell modules for PowerApps Administration. Sets up Git and configures it with your Windows credentials. Installs Azure Storage Explorer. Downloads the latest release of XrmToolBox, extracts it to your Desktop, and organizes it in a specified folder. Installs useful VSCode extensions for Azure and Power Platform development. How to Use the Script Using the script is straightforward. Here are the steps to get your development environment set up quickly:\nOpen PowerShell with Administrative Privileges\nMake sure you run PowerShell as an administrator to ensure all tools and configurations can be applied. Run the Script\nCopy and paste the following command into your PowerShell window. This command will download the script from my GitHub repository and execute it. # Ensure the Desktop directory exists $desktopPath = [System.Environment]::GetFolderPath(\u0026#39;Desktop\u0026#39;) if (-Not (Test-Path -Path $desktopPath)) { New-Item -ItemType Directory -Path $desktopPath | Out-Null } # Download and run the script from GitHub Invoke-WebRequest -Uri \u0026#34;https://raw.githubusercontent.com/itweedie/Power-Platform-Developer-Tools-Install-Script/main/install-power-platform-dev-tools.ps1\u0026#34; -OutFile \u0026#34;$desktopPath\\install-power-platform-dev-tools.ps1\u0026#34; PowerShell -ExecutionPolicy Bypass -File \u0026#34;$desktopPath\\install-power-platform-dev-tools.ps1\u0026#34; Follow the Prompts\nThe script will guide you through the installation process, ensuring all necessary tools are set up correctly. Behind the Scenes: How the Script Works The script is designed to handle various installation tasks and configurations efficiently. Here’s a brief overview of its key components:\nWinget Commands: The script uses winget to install several essential tools, capturing any errors encountered during the process. PowerShell Modules: It installs important PowerShell modules for PowerApps development. 
Git Configuration: The script configures Git with your Windows credentials, ensuring a smooth version control setup. VSCode Extensions: Several VSCode extensions are installed to enhance your development experience with Azure and Power Platform. Node Package Manager (NPM): Installs NPM for managing JavaScript packages, a crucial tool for many development projects. The full script You can view the full script here; feel free to contribute\nPower-Platform-Developer-Tools-Install-Script/install-power-platform-dev-tools.ps1 at main · itweedie/Power-Platform-Developer-Tools-Install-Script (github.com)\nConclusion Automating the setup of your development environment can significantly improve your productivity and ensure consistency across different machines. By using my PowerShell script, you can quickly get all your essential tools installed and configured, allowing you to focus on what you do best—developing solutions on the Power Platform.\nFeel free to check out the Power Platform Developer Tools Install Script on GitHub. Contributions and feedback are welcome!\nHappy coding!\nRevisions 2024-07-03: Posted on https://techtweedie.github.io, my new self-hosted blog Originally posted on May 24, 2024 on my blog https://helpmewithmy.technology, a platform I am currently migrating away from. ","date":"June 4, 2024","hero":"/posts/power-platform-developer-tools-install-script/featureImage.jpg","permalink":"https://techtweedie.github.io/posts/power-platform-developer-tools-install-script/","summary":"","tags":["Developer Tools","Install Script","Automation","PowerShell"],"title":"Power Platform Developer Tools Install Script"},{"categories":["Power Platform"],"contents":"Introduction Have you ever wondered whether you can use environment variables in a Power FX button, or how to create a custom email link button in the command bar of your Power Apps model-driven application using environment variables and Power FX? Luckily, Power Apps has made it possible to achieve this. 
With the help of Power FX, which is the same language used in canvas app development, anyone can efficiently customise their app’s command bar without needing prior knowledge of JavaScript or the Ribbon Workbench.\nIn this blog post, we’ll guide you through the steps involved in customising the command bar of your Power Apps model-driven app. You’ll learn how to use environment variables to create a button that dynamically generates email links. This capability provides users with greater control over their app interfaces and streamlines their interactions, making operations within the app more effective and tailored to specific needs.\nAssumptions You are in a Dataverse environment. You are in a solution. You have a model-driven app already in that solution. You have a table you would like to email records on. Create environment variables Navigate to your solution, select All and then from the menu at the top, select New, then choose More \u0026gt; Environment variable.\nThen, fill in the details of your new environment variable. We will use these values later.\nFor my example, I created three environment variables, all relating to the email we will send.\nCommand Bar Customisation Navigate to your Model Driven App Within your solution, navigate to the model-driven app, and open it up in edit mode.\nNavigate to the command bar editor Find the table you wish to add the custom command to, and then click on the ellipsis (three dots), then choose Edit command bar.\nYou will then be asked which command bar you wish to edit, choose Main grid.\nCreate your new button Choose New, and then select Command\nThis will place an object called NewCommand on the command bar.
You can drag this to any position you want.\nWith the command selected, you will see on the right-hand side of the screen a number of options available for you to select.\nFor this demonstration, I am using the following values:\nLabel: Email Record Icon: Use Icon, Email Action: Run formula (at this point, if you have not already, you will be asked to create a component library) Visibility: Show on condition from formula Tooltip title: Email record(s) Tooltip description: This button emails all of the selected records Accessibility text: Press this button to email selected records Enable Environment Variables in the Component Library Save and Publish your changes so far.\nThen we need to move to the component library.\nThen you will be presented with a screen very similar to the one used for canvas apps.\nFrom this screen you need to press on Add data and then add both the Environment Variable Definitions and Environment Variable Values tables from Dataverse.\nConfigure our new button In the top left-hand corner, you will see a drop-down box; this allows you to select either Visible or OnSelect.\nWith Visible chosen, copy and paste the following code:\n// This is going to show the button as soon as an item is selected. !IsEmpty(Self.Selected.AllItems) This command is essentially verifying that there are items currently selected. If any items are selected, the result will be true, indicating that the list of selected items is not empty. Conversely, if no items are selected, the result will be false, signifying that the list is empty.\nWith OnSelect chosen, copy and paste the following code:\n// We are going to iterate over all items that have been selected in the Table.
ForAll( Self.Selected.AllItems, Patch( Agreements, Defaults(Agreements), {Name: \u0026#34;CLONE - \u0026#34; \u0026amp; Text(Now(), \u0026#34;[$-en-US]yyyy-mm-dd hh:mm:ss\u0026#34;) \u0026amp; \u0026#34; - \u0026#34; \u0026amp; ThisRecord.Name} ) ); Notify(\u0026#34;The records selected have been cloned.\u0026#34;) Save, Publish, and Test\nNow that we have finished our component, press on Save and Publish\nThen navigate back to your Model Driven App, and Play the app\nOnce it opens, press Ctrl + F5 to refresh your browser. Then you should see the Clone Record button in the command bar at the top of the screen.\nWhen we press on the button we get the following result.\nFrequently Asked Questions 1. What permissions are needed to implement the custom PowerFX button?\nTo implement a custom PowerFX button in a model-driven app, you need to have the necessary permissions to edit the app within Microsoft PowerApps. Typically, this means you should be either an app maker or a system administrator within your environment.\n2. Can the cloning process handle related records?\nYes, the PowerFX script can be adjusted to handle related records depending on your specific requirements. You might need to modify the script to explicitly specify which related entities to clone along with the primary record.\n3. How can I ensure data consistency during the cloning process?\nTo ensure data consistency, consider implementing validation rules in your PowerFX script. This could include checks to prevent cloning of incomplete records or to ensure that all mandatory fields are populated before a record is cloned.\n4. Is there a limit to the number of records I can clone at one time?\nWhile PowerFX efficiently handles multiple records, performance may vary based on the complexity of the data and the number of fields being cloned. It’s advisable to test the process with different volumes of data to determine an optimal batch size for your use case.\n5. 
What happens if the cloning process fails?\nImplement error handling within your PowerFX script to manage failures. You can set up notifications that alert you to any issues during the cloning process, allowing for quick troubleshooting and resolution.\nConclusion Implementing a custom PowerFX button in model-driven apps can help you clone multiple records efficiently, thereby enhancing data management capabilities within your app. Although the initial setup requires some configuration and testing, the long-term benefits of accurately and swiftly replicating data are invaluable. Robust error handling and data validation mechanisms are essential to maintain data integrity and system reliability. With the correct setup, this solution can empower users to manage larger data sets more easily, ultimately boosting productivity and ensuring consistent data handling across your applications.\nWhere Else You Can Find This Post This post was originally posted on May 24, 2024 on my blog https://helpmewithmy.technology, a platform I am currently migrating away from. ","date":"May 24, 2024","hero":"/posts/variables-in-powerfx-command-buttons/featureImage.jpg","permalink":"https://techtweedie.github.io/posts/variables-in-powerfx-command-buttons/","summary":"","tags":["Environment Variable","PowerFX","Improve User Experience","Model Driven Apps"],"title":"Variables in PowerFX Command Buttons"},{"categories":["Power Platform"],"contents":"Introduction Cloning multiple records within a data view in a Model-Driven App is a common request from end users. It helps them quickly fill out data fields in situations where only a few values differ between records.\nIn this blog post, we will explore how to add a custom button to the command bar in a Power Apps model-driven application to duplicate multiple records. Power Apps recently introduced this helpful feature, and the best part is that it can be achieved using Power FX, the same language used for developing canvas apps.
This latest capability has simplified the model-driven app development process, enabling anyone to customise the command bar, even those who are not proficient in JavaScript or experienced with the Ribbon Workbench.\nThis blog post will guide you through customising the command bar in a Power Apps model-driven app. You will learn how to create a button to duplicate one or more records in a table. This new feature will give users more control over their data and enable them to perform actions more efficiently.\nCommand Bar Customisation Assumptions You are in a Dataverse environment You are in a solution You have a model-driven app already in that solution You have a table you would like to clone records on Navigate to your Model Driven App Within your solution, navigate to the model-driven app, and open it up in edit mode.\nNavigate to the command bar editor Find the table you wish to add the custom command to, and then click on the ellipsis (three dots), then choose Edit command bar.\nYou will then be asked which command bar you wish to edit, choose Main grid.\nCreate your new button Choose New, and then select Command\nThis will place an object called NewCommand on the command bar.
You can drag this to any position you want.\nWith the command selected, you will see on the right-hand side of the screen a number of options available for you to select.\nFor this demonstration, I am using the following values:\nLabel: Clone Record Icon: Use Icon, Clone Action: Run formula (at this point, if you have not already, you will be asked to create a component library) Visibility: Show on condition from formula Tooltip title: Clone record(s) Tooltip description: This button clones all of the selected records Accessibility text: Press this button to clone selected records Configure our new button In the top left-hand corner, you will see a drop-down box; this allows you to select either Visible or OnSelect.\nWith Visible chosen, copy and paste the following code:\n// This is going to show the button as soon as an item is selected. !IsEmpty(Self.Selected.AllItems) This command is essentially verifying that there are items currently selected. If any items are selected, the result will be true, indicating that the list of selected items is not empty. Conversely, if no items are selected, the result will be false, signifying that the list is empty.\nWith OnSelect chosen, copy and paste the following code:\n// We are going to iterate over all items that have been selected in the Table. ForAll( Self.Selected.AllItems, Patch( Agreements, Defaults(Agreements), {Name: \u0026#34;CLONE - \u0026#34; \u0026amp; Text(Now(), \u0026#34;[$-en-US]yyyy-mm-dd hh:mm:ss\u0026#34;) \u0026amp; \u0026#34; - \u0026#34; \u0026amp; ThisRecord.Name} ) ); Notify(\u0026#34;The records selected have been cloned.\u0026#34;) Save, Publish, and Test\nNow that we have finished our component, press on Save and Publish\nThen navigate back to your Model Driven App, and Play the app\nOnce it opens, press Ctrl + F5 to refresh your browser.
Then you should see the Clone Record button in the command bar at the top of the screen.\nWhen we press on the button we get the following result.\nFrequently Asked Questions What is a Model-Driven App in Power Apps?\nA Model-Driven App in Power Apps is an application that is primarily driven by the underlying data model and business processes. It uses the metadata of your data and relationships to automatically generate and manage the app\u0026rsquo;s user interface.\nWhat is Power FX?\nPower FX is the formula language used in Power Apps, similar to Excel formulas. It is used to define logic and operations within the apps, making it easier for users without extensive programming backgrounds to customize applications.\nHow can I add a custom button to the command bar in a Model-Driven App?\nTo add a custom button to the command bar, navigate to your Model-Driven App in edit mode, access the command bar editor, and select the table you wish to customise. Choose to add a new command, configure it with your desired settings, and save your changes.\nWhat does the \u0026lsquo;Clone Record\u0026rsquo; button do?\nThe \u0026lsquo;Clone Record\u0026rsquo; button duplicates the selected records in a table, allowing users to quickly replicate data with minor modifications. It uses a Power FX formula to handle the duplication process.\nHow do I ensure the \u0026lsquo;Clone Record\u0026rsquo; button is visible only when items are selected?\nTo control the visibility of the \u0026lsquo;Clone Record\u0026rsquo; button, use the Power FX expression !IsEmpty(Self.Selected.AllItems). This expression checks if any items are selected and shows the button accordingly.\nConclusion In this blog post, we explored the addition of a custom \u0026lsquo;Clone Record\u0026rsquo; button to the command bar in a Power Apps model-driven application. 
This functionality leverages the newly introduced capabilities of Power FX, allowing even those with limited coding expertise to enhance their apps effectively. By following the steps outlined, you can customize your command bar to better suit your operational needs and streamline data management tasks.\nThis capability not only boosts productivity by simplifying the duplication of records but also empowers users to manage their data more autonomously. The integration of such features is a testament to the evolving landscape of app development, where user-friendliness and customisation go hand in hand. Whether you\u0026rsquo;re a seasoned developer or a business user, these tools can help optimise your workflows and improve the overall efficiency of your operations.\nRemember to save, publish, and test your new button to ensure it functions as expected. By doing so, you\u0026rsquo;ll enhance the user experience within your Model-Driven App and take full advantage of the robust features Power Apps offers.\n","date":"April 22, 2024","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/posts/clone-with-powerfx-command-button/","summary":"","tags":["Command Bar","PowerFX","Improve User Experience","Model Driven Apps"],"title":"Clone with PowerFX Command button"},{"categories":["Troubleshooting"],"contents":"Streamlining Webhook Testing with Webhook.site I often find myself undertaking some form of prototyping or investigation. For instance, let\u0026rsquo;s say I am publishing an endpoint for a client to connect to via an HTTP call, often also referred to as a webhook.\nOne issue this introduces is being able to see what the actual request being made looks like.\nI came across Webhook.site, which has proved particularly useful in testing and troubleshooting webhooks. This platform has been a great help, especially when dealing with hidden values, such as API key authentication, where you can set the value but not see it again.
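To make this concrete, here is a sketch of the kind of request you might point at Webhook.site so it can be inspected. The URL, header name, and payload below are placeholders of my own, not values from the original post:

```powershell
# Hypothetical example: replace the URL with the unique address
# Webhook.site generates for you.
$url = "https://webhook.site/your-unique-id"
$headers = @{ "x-api-key" = "a-value-the-calling-app-normally-hides" }
$body = @{ event = "test"; value = 42 } | ConvertTo-Json

# Webhook.site will display the method, headers, and body of this request,
# which is exactly the information that is otherwise hard to observe.
Invoke-RestMethod -Uri $url -Method Post -Headers $headers `
    -ContentType "application/json" -Body $body
```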
It provides a straightforward way to inspect webhook payloads in real time, without the need for a dedicated backend server for testing.\nThe Challenge of Hidden Values\nDealing with webhooks can be quite a challenge, especially when the data you want to observe is not easily visible within the application that\u0026rsquo;s making the call. It can also be difficult when the receiving application is not responding in the way you expect it to, and you want to check what information it is actually receiving. Before discovering Webhook.site, diagnosing issues or ensuring correct transmission of information involved a cumbersome process of creating custom logging scripts or setting up proxy servers to capture the data.\nA Solution That Delivers\nWebhook.site offers a distinct URL that can be used to direct any webhook for immediate inspection. This straightforward process enables me to effortlessly access all the essential details of the requests, including headers, payloads, as well as any elusive values, via a user-friendly interface. It\u0026rsquo;s like having a magnifying glass that unveils the intricacies of webhook operations, making it easy to identify and resolve any issues quickly.\nTransforming Troubleshooting\nWebhook.site not only displays webhook data but also allows manipulating responses. This feature helps in testing different responses and troubleshooting thoroughly.\nTry it out at https://webhook.site/\n","date":"February 9, 2024","hero":"/posts/steamlining-webhook-testing/featureImage.png","permalink":"https://techtweedie.github.io/posts/steamlining-webhook-testing/","summary":"","tags":["Webhook Testing"],"title":"Streamlining Webhook Testing"},{"categories":["Power Platform"],"contents":"Introduction Want to deploy your Power Apps Solution files quickly via Pipelines? Not sure how to set your environment variables or connection references?
This blog post will explain how we solved this problem using a Solution Configuration File.\nUnderstanding the Need for a Solution Settings File A solution settings file is crucial when your solution involves environment variables or connection references. Without it, you might end up with configurations that do not carry over the intended values, leading to solutions that don’t behave as expected in different environments.\nSteps Download your solution Download either a managed or unmanaged copy of your solution. Generate the Settings File Use the Power Apps CLI (Command Line Interface) to create your settings file. Execute the following command, adjusting the file names as necessary: pac solution create-settings --solution-zip .\\\\YourSolution_managed.zip --settings-file .\\\\YourSettings.json This command generates a YourSettings.json file, which might look something like this initially: { \u0026#34;EnvironmentVariables\u0026#34;: [ { \u0026#34;SchemaName\u0026#34;: \u0026#34;your_schema_name\u0026#34;, \u0026#34;Value\u0026#34;: \u0026#34;\u0026#34; } ], \u0026#34;ConnectionReferences\u0026#34;: [ { \u0026#34;LogicalName\u0026#34;: \u0026#34;your_logical_name\u0026#34;, \u0026#34;ConnectionId\u0026#34;: \u0026#34;\u0026#34;, \u0026#34;ConnectorId\u0026#34;: \u0026#34;/providers/Microsoft.PowerApps/apis/your_connector\u0026#34; } ] } Populate Environment Variable Values You can directly edit the YourSettings.json file to include the appropriate values. For environment variables, which are typically straightforward text values, it’s as simple as inserting the desired content: \u0026#34;EnvironmentVariables\u0026#34;: [ { \u0026#34;SchemaName\u0026#34;: \u0026#34;your_schema_name\u0026#34;, \u0026#34;Value\u0026#34;: \u0026#34;Your Custom Message\u0026#34; } ] Update Connection References\nTo fill in the connection references:\nNavigate to make.powerapps.com, select your target environment, and then go to Dataverse \u0026gt; Connections. 
Either create a new connection or use an existing one that matches your needs. Once your connection is set up, find its Connection ID in the URL and update your settings file accordingly. Make sure you share the connection with the service account that will be importing the connection. Your final ConnectionReferences section should resemble this:\n\u0026#34;ConnectionReferences\u0026#34;: [ { \u0026#34;LogicalName\u0026#34;: \u0026#34;your_logical_name\u0026#34;, \u0026#34;ConnectionId\u0026#34;: \u0026#34;your_connection_id\u0026#34;, \u0026#34;ConnectorId\u0026#34;: \u0026#34;/providers/Microsoft.PowerApps/apis/your_connector\u0026#34; } ] Import the Updated Solution With your settings file populated, import the solution back into Power Apps using the CLI or a Pipeline. This command updates your solution in the target environment with the specified environment variable and connection reference values. pac solution import --path .\\YourSolution_managed.zip --settings-file .\\YourSettings.json Conclusion By following these steps, you’ve successfully created and configured a solution settings file, tailoring your Power Platform solution to meet the specific needs of its deployment environment. This process not only ensures that your solutions are portable but also maintains consistency across different environments, making your Power Platform development more efficient and reliable.\n","date":"February 7, 2024","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/posts/choose-what-dataverse-search-indexes/","summary":"","tags":["Pipeline","Model Driven Apps"],"title":"Create a Solution Configuration File"},{"categories":["Digital Transformation"],"contents":"In the vast and evolving landscape of software development, the principles guiding our choices often transcend the technical. 
One psychological concept, Maslow’s Hammer, poignantly captures a cognitive bias that impacts much more than our personal lives—it resonates deeply within the realm of software architecture. Commonly paraphrased as, “If all you have is a hammer, everything looks like a nail,” this law, attributed to Abraham Maslow, underscores the tendency to over-rely on familiar tools or methods at the expense of potentially more effective solutions.\nThe Essence of Maslow’s Hammer At its core, Maslow’s Hammer, or the Law of the Instrument, critiques our inclination to default to the familiar. In the context of software architecture, this means favoring technologies, patterns, or approaches we know well, even when they might not be the most suitable for the task at hand. This cognitive bias can significantly influence the efficiency, scalability, and adaptability of software solutions.\nThe Impact on Software Architecture Software architecture is the blueprint for a system, defining its structure, components, and the interactions between them. The choices made during this phase can have profound implications on the final product. Here’s how Maslow’s Hammer manifests in this domain:\nTechnology Choice: Architects might lean towards platforms and technologies they’re comfortable with, which can lead to suboptimal or overly complex solutions. For instance, my preference for Microsoft technologies, particularly Power Platform and Azure, reflects a personal bias that could limit the exploration of alternative, potentially more fitting options. Design Patterns and Solutions: The tendency to apply familiar design patterns indiscriminately, dubbed “patternitis,” can unnecessarily complicate architecture. Similarly, reusing known solutions without fully considering unique problem aspects can result in inefficiencies or new challenges. Innovation and Adaptability: A narrow focus on known tools and methods can stifle innovation and reduce the architecture’s ability to adapt to future needs. 
The software industry is dynamic, with new technologies and paradigms emerging regularly that could offer superior solutions. Understanding User Needs: An overemphasis on technology can obscure the actual needs of end-users. Effective software architecture must focus on solving real problems efficiently, requiring an understanding of users’ needs and a willingness to explore diverse solutions. Navigating the Bias Acknowledging our biases towards certain technologies or methodologies is the first step in mitigating the impact of Maslow’s Hammer. Here are strategies to navigate this bias in software architecture:\nContinuous Learning: Stay abreast of emerging technologies, design patterns, and methodologies. Broadening your knowledge base enables you to select the most appropriate tools for a given problem. Diverse Teams: Collaborate with professionals who have varied expertise and preferences. This diversity can counteract individual biases and lead to more balanced and innovative solutions. User-Centered Design: Prioritize understanding the needs of the end-users. Solutions should be driven by user requirements, not the preferences or comfort zones of the development team. Experimentation and Prototyping: Test multiple approaches to find the best solution. Prototyping and experimentation can reveal the strengths and weaknesses of different options in a practical context. Reflective Practice: Regularly review and reflect on architectural decisions. Consider what biases might have influenced these choices and how outcomes were affected. Conclusion While we all have our preferred tools and technologies, it’s crucial to recognize how these preferences can shape our architectural decisions. Embracing a mindset of openness, curiosity, and user focus can help us overcome the limitations of Maslow’s Hammer, leading to software architectures that are not only innovative and efficient but also truly aligned with the needs they aim to serve. 
By challenging our predispositions and expanding our toolkits, we can ensure that our architectural blueprints lead to robust, scalable, and user-centric software solutions.\n","date":"February 3, 2024","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/posts/bias-of-maslows-hammer/","summary":"","tags":["Architecture"],"title":"Bias of Maslow’s Hammer"},{"categories":["Troubleshooting"],"contents":"Introduction I recently set up a new WordPress website for a friend. It was late one evening, and I was in a bit of a rush; as with many people online, I chose to use Cloudflare to protect the website.\nHowever, problems arose after setting up the WordPress website when I configured the SSL/TLS side of things.\nI switched on Cloudflare’s proxy as normal.\nSo far so good.\nThen I continued to edit the site before handing it over to my friend; however, I noticed I was getting load errors.\nChecking some of the settings, I noticed that both the WordPress Address (URL) and Site Address (URL) were set to http://a.wordpress-website.com. I changed them to https://a.wordpress-website.com. To my annoyance, when I did, I was met with ERR_TOO_MANY_REDIRECTS.\nThis error occurs when a webpage gets stuck in a loop of continuous redirects, making it impossible to access the desired content.\nWhat had I done wrong? In this post, I run through the steps I took to understand what happened and how I fixed it.\nSome of the reasons why this problem can occur Something local: It could be because of some issue local to my device, something browser/client-based. Redirect loops: This is where https://a.wordpress-website.com points to https://b.wordpress-website.com, which then, in turn, points the user back to https://a.wordpress-website.com A misconfigured SSL/TLS Encryption mode: Your domain’s SSL/TLS Encryption mode controls how Cloudflare connects to your origin server and how SSL certificates presented by your origin will be validated.
Various Edge Certificates settings: If you enable Always Use HTTPS for your domain, Cloudflare will automatically redirect all HTTP requests to HTTPS for every subdomain and host in your application. However, first, let’s take a step back so we can all understand what communication is going on here.\nIn our scenario, the browser is going to Cloudflare, and Cloudflare is then going to the Origin Server, which is where the WordPress website is.\nSo, how did I approach the problem? Something Local Before diving too far into solutions, it’s essential to understand if we have a real problem or if this results from some cache, DNS, or other browser thing. This can happen when bringing any site online.\nSo, I cleared all my cache, flushed my DNS, restarted, and nothing changed.\nNext on my list was: is it just me? What if I use a different device; there might be something specific to this client. So, I fired up Windows Sandbox (see my post on how to set this up here), but no luck; I picked up my iPhone, still no luck, same problem.\nSo, it’s not just me.\nRedirect loops Next, I checked the WordPress settings to see if any redirect loops or plugins were causing the issue. After disabling everything, testing, and seeing that the problem was still there, I was convinced the plugins were not to blame, so I checked through the WordPress settings.\nEverything looked correct, so I thought, what if I changed the setting back for both the WordPress Address (URL) and Site Address (URL). Sure enough, the problem was resolved. However, my page load errors returned.\nI thought I would try switching off the Cloudflare proxy for this hostname and setting the site URL back to https. No error. Switched Cloudflare back on. The redirect loop came back.\nI had established that I could not have Cloudflare proxy on and have the site URLs set to HTTPS.
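A quick way to observe a loop like this from the command line is to disable redirect-following and look at where each response points. This is a sketch of my own (assumes PowerShell 7+, and the hostname is the placeholder used in this post):

```powershell
# With redirect-following disabled, a single request reveals whether the
# site is bouncing requests between the HTTP and HTTPS forms of the URL.
$resp = Invoke-WebRequest -Uri "https://a.wordpress-website.com" `
    -MaximumRedirection 0 -SkipHttpErrorCheck
$resp.StatusCode        # 301/302 while the loop is active
$resp.Headers.Location  # the address it redirects to
```

If the Location header keeps alternating between the http:// and https:// form of the same address, you are looking at the redirect loop described here.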
This didn’t seem right; I had done it this way before, but what was different now?\nDiving into the problem I knew where the problem was.\nOne of the settings in Cloudflare was causing Cloudflare to make an HTTP request to the origin server. The origin server was redirecting this request to HTTPS. As you can see, step 2 does not make an HTTPS request to the origin.\nI needed to find the Cloudflare setting that handles the connection between Cloudflare and the origin server.\nThe solution Settings to control how Cloudflare connects to the origin server are stored under the dashboard for your domain, under SSL/TLS. See this useful article. Reading through the options, I saw the following for one of them:\nFlexible: Setting your encryption mode to Flexible makes your site partially secure. Cloudflare allows HTTPS connections between your visitor and Cloudflare, but all connections between Cloudflare and your origin are made through HTTP. As a result, an SSL certificate is not required on your origin.\nFrom: https://developers.cloudflare.com/ssl/origin-configuration/ssl-modes/\nMy site was configured in flexible mode, an easy fix, but an option I had clearly missed.\nTips and Best Practices Thinking about what I could do to avoid this problem in the future, a few thoughts come to mind.\n","date":"September 17, 2023","hero":"/posts/too-many-redirects-on-wordpress/featureImage.png","permalink":"https://techtweedie.github.io/posts/too-many-redirects-on-wordpress/","summary":"","tags":["WordPress","Cloudflare","Reverse Proxy"],"title":"Too Many Redirects on WordPress"},{"categories":["Digital Transformation"],"contents":"In today’s remote work environment, the significance of effective team meetings cannot be overstated. These gatherings are pivotal for aligning goals, addressing challenges, and fostering team cohesion.\nHowever, not all meetings are created equal.
The introduction of a Thinking Environment, as outlined by Nancy Kline in her book “Time to Think,” offers a transformative approach to traditional meetings. When combined with the structured reflection of retrospectives, this approach can elevate the effectiveness of team meetings.\nThis blog post explores how integrating the principles of a Thinking Environment with retrospectives can create a regular team meeting structure that is both reflective and productive, using a real-life example to illustrate its impact.\nThe Thinking Environment: Foundations for Effective Communication A Thinking Environment prioritises conditions where individuals can think for themselves with clarity and creativity. Kline identifies ten components essential for fostering such an environment:\nAttention, Incisive Questions, Equality, Appreciation, Ease, Encouragement, Feelings, Information, Diversity, and Place. These principles ensure that each team member feels valued, heard, and empowered to contribute their best ideas.\nRetrospectives: Reflecting for Improvement Retrospectives are structured meetings used by teams, especially in agile methodologies, to reflect on their recent work cycle. The goal is to discuss what went well, what didn’t, and how processes can be improved moving forward. While retrospectives focus on reflection, incorporating a Thinking Environment enriches these discussions, making them more inclusive and insightful.\nMerging Concepts: A New Approach to Team Meetings By blending the Thinking Environment with retrospectives, teams can adopt a meeting structure that enhances engagement, creativity, and productivity. 
This approach involves structuring meetings around key principles of listening, equality, and appreciation, while systematically reflecting on performance and planning for improvement.\nA Real-Life Meeting Example Guidance for creating a Thinking Environment in team meetings emphasizes active listening, non-interruption, and waiting for cues to contribute. Here’s how, as a Systems Architecture Manager, I applied these principles:\nGuidance for Participation: Participants were advised to minimize distractions, refrain from interrupting, and contribute only when invited or after someone had finished speaking.\nHere were the guidelines: When someone is presenting, everyone is listening; try and minimise everything else on your screens if you can. Don’t interrupt anyone, unless the meeting chair is moving the meeting to the next agenda point. Wait until someone invites others’ thoughts and ideas, or they have indicated they have finished. Agenda Structure: The meeting followed a structured agenda that included reflecting on the past week’s successes and challenges, company updates, current blockers, focus for the next week, learning points, and an open discussion segment.\nHere is the agenda: What’s gone well and not gone well this week? 1 min for each. Company updates What’s blocking you at the moment What’s your focus for next week What are you learning at the moment Anything else anyone wants to discuss Outcomes I Found from the New Meeting Structure The adoption of this meeting structure led to significant improvements:\nTeam members felt genuinely heard, as evidenced by the attentive and uninterrupted listening. The structured reflection enabled a comprehensive understanding of what was hindering the team, allowing for targeted solutions. Encouraging the sharing of learning points and next week’s focus fostered a culture of continuous improvement and forward planning.
The open discussion segment ensured that all team members had the opportunity to raise additional points, enhancing inclusivity and team cohesion. Overall, the team experienced increased productivity and a stronger sense of collaboration. Conclusion The integration of a Thinking Environment with the reflective practice of retrospectives offers a powerful framework for conducting team meetings. This approach not only ensures that team members feel valued and heard but also facilitates a deeper understanding of team dynamics and challenges. The real-life example provided demonstrates the potential for such meetings to enhance productivity, foster a positive team culture, and drive continuous improvement. By adopting this meeting structure, teams can create a more engaging, effective, and thoughtful working environment that propels them towards their goals.\nRevisions and Publications First published on https://helpmewithmy.technology on 2022-12-18 Migrated to new blog https://techtweedie.github.io on 2024-01-03 ","date":"December 18, 2022","hero":"/posts/thinking-environment-meetings/featureImage.png","permalink":"https://techtweedie.github.io/posts/thinking-environment-meetings/","summary":"","tags":["Work smarter","Thinking Environment","Team Cohesion"],"title":"Thinking Environment Meetings"},{"categories":["How to"],"contents":"Azure functions are a very handy way of getting something small done very quickly. 
Often these have to be secured, sometimes with an identity provider that is not straight out of the box.\nToday I am going to take you through the steps I took to secure an Azure Function using Azure B2C.\nStep 1 – Create app in Azure B2C PREREQUISITE Azure Function The Azure Function URL Azure B2C Tenant configured with user flows STEP 1 To start with, register an application in Azure B2C.\nIn Redirect URI (where I have put in https://jwt.ms), enter the following: https://{azure function url}/.auth/login/aadbc/callback\nOnce you have registered it, open up the application and click on Authentication. Scroll down and tick both Access tokens and ID tokens.\nNext we will need to create the secret, so click on Certificates \u0026amp; secrets in the left-hand menu, and then click on New client secret.\nOnce you click on Add, you will then be able to see the secret; copy it somewhere safe, as we will need it later and you won’t be able to see it again.\nWe also need to grab the client ID of the application we have created.\nAlso, while you are in your Azure B2C tenant, grab the URL to the .well-known/openid-configuration. There are a few ways you can get this; one is by going into one of your user flows used for sign-up/sign-in. Click on Overview, and then click on Run user flow.\nStep 2 – Add authentication to your Azure Function Navigate to your Azure Function; this won’t be inside your Azure B2C tenant.\nGo to Authentication under Settings in the left-hand menu.\nClick on Add identity provider, and choose OpenID Connect.
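For reference, the Azure B2C metadata URL mentioned above typically has the following shape (a hedged sketch; the tenant and user-flow names are placeholders you must substitute with your own):

```
https://<tenant-name>.b2clogin.com/<tenant-name>.onmicrosoft.com/<user-flow-name>/v2.0/.well-known/openid-configuration
```

Fetching this URL in a browser should return a JSON document whose authorization_endpoint and token_endpoint values confirm you have the right user flow.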
You will then be presented with a page that looks like this.\nUnder OpenID provider name enter in: aadbc\nUnder Metadata URL, enter the .well-known/openid-configuration URL\nThen make sure that under Restricted access you have chosen Require authentication, and then under Unauthenticated requests you have chosen an option appropriate for your needs.\nStep 3 – Test it Give it a test: try to access something you have placed behind it and it should work 🙂\n","date":"December 14, 2022","hero":"/posts/221214-azure-function-with-azure-b2c/featureImage.png","permalink":"https://techtweedie.github.io/posts/221214-azure-function-with-azure-b2c/","summary":"","tags":["Azure Function","Azure B2C","Azure"],"title":"Azure Function with Azure B2C"},{"categories":["Trubbleshooting"],"contents":"This post is an example of how I posted data to Azure Table Storage from Postman.\nPrerequisite You have access to Postman You have an Azure Storage account Steps Go to your Storage Account Click on Tables, and then click on Add table Add table to Azure Storage Account\nYou should now have a table; in my example these are called test and test2.\nNext, generate a shared access signature (SAS), which is a URI that grants restricted access. Click on Generate SAS and connection string and leave the generated strings on screen; we will need them later.\nNow go to Postman. From Postman we are going to do an insert-or-replace request.\nRequest URL: https://\u0026lt;\u0026lt;StorageAccountName\u0026gt;\u0026gt;.table.core.windows.net/\u0026lt;\u0026lt;TableName\u0026gt;\u0026gt;(PartitionKey=\u0026#39;myPartitionKey\u0026#39;, RowKey=\u0026#39;myRowKey\u0026#39;)\nNote that the keys must be wrapped in straight single quotes.\nSee example below.\nhttps://ukaxxxdata01.table.core.windows.net/test2(PartitionKey=\u0026#39;A\u0026#39;, RowKey=\u0026#39;44\u0026#39;) Then go back to the screen where you generated your SAS and look down until you get to SAS token.\nCopy the value, and paste it after the request URL so it forms one long string. 
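The same insert-or-replace request can also be made outside Postman. Here is a minimal Go sketch; the account name, table, keys, and SAS token are placeholder values, and the headers shown are the ones the Table service commonly expects (it only builds and prints the request rather than sending it):

```go
package main

import (
	"bytes"
	"fmt"
	"net/http"
)

// buildEntityURL composes the Table service insert-or-replace URL:
// https://<account>.table.core.windows.net/<table>(PartitionKey='<pk>',RowKey='<rk>')?<sasToken>
func buildEntityURL(account, table, pk, rk, sasToken string) string {
	return fmt.Sprintf("https://%s.table.core.windows.net/%s(PartitionKey='%s',RowKey='%s')?%s",
		account, table, pk, rk, sasToken)
}

func main() {
	// Placeholder values -- substitute your own account, table, keys and SAS token.
	url := buildEntityURL("ukaxxxdata01", "test2", "A", "44", "sv=2021-06-08\u0026sig=REDACTED")

	body := []byte(`{"correlationId":"0d145550","companyId":"NL-0-02355555","userId":"10555"}`)

	req, err := http.NewRequest(http.MethodPut, url, bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Accept", "application/json;odata=nometadata")

	// Print the request instead of sending it; to actually send,
	// use resp, err := http.DefaultClient.Do(req) -- a successful
	// insert-or-replace returns 204 No Content.
	fmt.Println(req.Method, req.URL.String())
}
```

The PartitionKey/RowKey pair in the URL identifies the entity, which is why a repeat PUT replaces rather than duplicates the row.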
See example below.\nhttps://ukaxxxdata01.table.core.windows.net/test2(PartitionKey=\u0026#39;A\u0026#39;, RowKey=\u0026#39;44\u0026#39;)?sv=2021-06-08\u0026amp;ss=t\u0026amp;srt=o\u0026amp;sp=rwdlacu\u0026amp;se=2022-11-15T21:02:37Z\u0026amp;st=2022-11-15T13:02:37Z\u0026amp;sip=1.1.1.1\u0026amp;spr=https,http\u0026amp;sig=49MrnoRgsbL9bvzIr%2BrEfxpQP3MAAe4rbc1O44NzcHc%3D Method: PUT\nUnder Body choose raw and then JSON from the drop-down menu. Then enter in some sample JSON.\n{ \u0026#34;correlationId\u0026#34;: \u0026#34;0d145550\u0026#34;, \u0026#34;companyId\u0026#34;: \u0026#34;NL-0-02355555\u0026#34;, \u0026#34;dateOfOrder\u0026#34;: \u0026#34;2021-02-04T09:33:36.166Z\u0026#34;, \u0026#34;language\u0026#34;: \u0026#34;en\u0026#34;, \u0026#34;userId\u0026#34;: \u0026#34;10555\u0026#34; } Under Headers, enter in the details as shown below:\nThen press Send, and you should get a 204 No Content response. Check it works To check it works, we go to the storage browser; I will do this in the Azure portal.\nClick on the table you just posted to, and you should see the data you have just sent to the table.\n","date":"November 15, 2022","hero":"/posts/postman-to-azure-table-storage/featureImage.png","permalink":"https://techtweedie.github.io/posts/postman-to-azure-table-storage/","summary":"","tags":["Postman","Azure Table Storage","Azure Storage Account"],"title":"Postman to Azure Table Storage"},{"categories":["Trubbleshooting"],"contents":"I had a problem where I needed to get my JWT token from Azure B2C OAuth 2.0 in order to troubleshoot an issue I was having in getting OAuth 2.0 working.\nSituation I needed to get the JWT token using Postman, decode it, and then provide it for troubleshooting the issue.\nHow did I do that Get Postman Create a new request; as far as I am aware it makes no difference what type of request you use, so in this example I will use a GET request. 
Click on Authorization, and then choose Type OAuth 2.0, and then choose Request Headers Next, on the right-hand side of the screen you will see some options for Configuring a New Token Enter in a Token Name; this is just used as a friendly name in Postman as far as I am aware. Choose a Grant Type; in this example I am using Authorisation Code. Untick Authorise using browser, and enter in a Callback URL; this must also be configured within your client settings with your provider, in my case within App Registrations within Azure B2C. Enter an Auth URL; in my case this is the authorisation endpoint in Azure B2C. Enter an Access Token URL; in my case this is the token URL from Azure B2C. Enter in the Client ID; for Azure B2C I got this from App Registrations. Enter in the Client Secret; for Azure B2C I got this from App Registrations. For Scope, enter in the required scopes you want to test with; in my case this was profile openid email. Under State enter in anything you want; for me this was just the word test. Under Client Authorisation, choose Send client credentials in body. I am not sure what this setting does or means. See my image below\nOnce you have filled in everything, click on Get New Access Token, and it will then take you through the login process. Complete the login process and a pop-up should appear with an access token. Copy the output for id_token to your clipboard, and then go to https://jwt.io/. Paste the encoded token in, and the decoded token should appear. ","date":"November 14, 2022","hero":"/posts/221114-azureb2c-oauth-in-postman/featureImage.png","permalink":"https://techtweedie.github.io/posts/221114-azureb2c-oauth-in-postman/","summary":"","tags":["JWT Token","oAuth","Azure B2C","Power Pages","Postman"],"title":"Azure B2C oAuth on Postman"},{"categories":["Power Platform"],"contents":"Introduction Do you want to deploy Power Pages on a custom domain securely without the need to buy a custom certificate? 
In this blog post, I’m going to show you how to quickly and easily acquire a PFX certificate that you can then use to deploy to Power Pages.\nPre-requisite This is a brilliant tool for generating SSL certificates, and it is simple to use and quick to deploy.\nYou can download the tool here: https://www.win-acme.com/\nSteps Make sure you run wacs.exe as an administrator\nWe are going to make a certificate with full options\nThen we are going to use manual input\nNext, we are going to create verification records manually on our DNS server\nThen choose RSA key\nThen we want to create a PFX archive\nPop in the location where you would like to store the PFX certificate\nCreate a password\nThis is a short-lived cert, so we don’t want to save it in the vault\nWe don’t want to take any additional steps\nWe don’t want to store it anywhere else\nIt then pops up with the TXT DNS record you need to add. This will need to be added at your registrar.\nOnce validation has completed, you can remove the record.\nIf you get a WindowsCryptographicException error, try running the programme as administrator.\nYou should have your PFX file now; if you need a hand, please get in touch and we can arrange some help.\n","date":"July 25, 2022","hero":"/posts/pfx-certificate-for-free-for-power-pages/featureImage.png","permalink":"https://techtweedie.github.io/posts/pfx-certificate-for-free-for-power-pages/","summary":"","tags":["Power Pages","Custom Domain","PFX Certificate"],"title":"PFX certificate for FREE for Power Pages"},{"categories":["Power Automate","Power Platform"],"contents":"Introduction Working with Power Automate, especially with HTTP triggers, can unlock a lot of potential in automating tasks and workflows. A common requirement is to use query parameters from the trigger URL within your flow. 
I’ve found a straightforward method to do this and thought it might be helpful to share.\nWhen your flow is triggered via an HTTP request, it might need to behave differently based on the values passed through query parameters. For instance, you may want your flow to process data differently depending on a user ID or a specific action indicated in the URL.\nHow to Access Query Parameters To access these query parameters directly in your flow, you can use a simple expression right after the HTTP request trigger. Here’s how:\nStep 1: Start with the HTTP Request Trigger Your flow should begin with an HTTP request trigger that listens for incoming requests.\nStep 2: Use the Expression to Access Parameters To grab a query parameter, use the expression:\n@triggerOutputs()['queries']['yourQueryParamName']\nReplace yourQueryParamName with the name of your query parameter. Note that the quotes must be straight single quotes; curly quotes will not work in an expression.\nFor example, if your HTTP request includes a query parameter called userId, you can access it like so: @triggerOutputs()['queries']['userId']\nThis lets you use the userId in subsequent steps of your flow, such as filtering database queries or customising responses.\nConclusion This method is a straightforward way to make your Power Automate flows more dynamic and responsive to the inputs they receive. I hope this tip helps you as much as it has helped me.\nFeel free to share your experiences or any other useful tips for working with HTTP triggers in Power Automate. 
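To make the expression concrete, here is a made-up example (the flow URL and parameter names are illustrative only, not from a real flow):

```
Trigger URL called by the client:
  https://<your-flow-url>/invoke?userId=10555\u0026action=refresh

Expressions available after the trigger fires:
  triggerOutputs()['queries']['userId']   ->  "10555"
  triggerOutputs()['queries']['action']   ->  "refresh"
```

Query parameter values arrive as strings, so convert them (for example with int()) if a later step needs a number.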
\n","date":"April 3, 2022","hero":"/posts/query-parameters-on-http-trigger/featureImage.png","permalink":"https://techtweedie.github.io/posts/query-parameters-on-http-trigger/","summary":"","tags":["HTTP Trigger","Automation"],"title":"Query Parameters on HTTP Trigger"},{"categories":null,"contents":"Go Notes ","date":"January 1, 0001","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/notes/go/_index.bn/","summary":"\u003ch1 id=\"go-notes\"\u003eGo Notes\u003c/h1\u003e","tags":null,"title":"Go এর নোট সমূহ"},{"categories":null,"contents":"","date":"January 1, 0001","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/notes/_index.bn/","summary":"","tags":null,"title":"নোট সমূহ"},{"categories":null,"contents":"Bash Notes ","date":"January 1, 0001","hero":"/images/default-hero.jpg","permalink":"https://techtweedie.github.io/notes/bash/_index.bn/","summary":"\u003ch1 id=\"bash-notes\"\u003eBash Notes\u003c/h1\u003e","tags":null,"title":"ব্যাশের নোট সমূহ"}]