<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>David Triana</title><link>https://davidtriana.com/</link><description>Recent content on David Triana</description><generator>Hugo -- gohugo.io</generator><language>en-us</language><managingEditor>david@davidtriana.com (David Triana)</managingEditor><webMaster>david@davidtriana.com (David Triana)</webMaster><copyright>2022 David Triana All rights reserved</copyright><lastBuildDate>Wed, 01 May 2024 00:00:00 +0000</lastBuildDate><atom:link href="https://davidtriana.com/index.xml" rel="self" type="application/rss+xml"/><item><title>Logic App VNET Integration Fails with ERROR: There was a conflict. SiteConfig.VnetRouteAllEnabled cannot be modified. Please modify the Site.VnetRouteAllEnabled property</title><link>https://davidtriana.com/posts/2024/logic-app-vnet-integration-error/</link><pubDate>Wed, 01 May 2024 00:00:00 +0000</pubDate><author>david@davidtriana.com (David Triana)</author><guid>https://davidtriana.com/posts/2024/logic-app-vnet-integration-error/</guid><description><h1 id="scenario">
Scenario
<a class="heading-link" href="#scenario">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h1>
<p>Something changed in Azure in the last week of April 2024. Specifically, an infrastructure-as-code pipeline that had been working fine for months started failing on April 30. This pipeline deploys a Logic App Standard application, then sets up VNET integration so that the logic app can call internal and on-premises services.</p>
<p>The command</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-bash" data-lang="bash">az webapp vnet-integration add --resource-group <span style="font-weight:bold;font-style:italic">\
</span><span style="font-weight:bold;font-style:italic"></span> --name <span style="font-weight:bold">$(</span>logicAppName<span style="font-weight:bold">)</span> <span style="font-weight:bold;font-style:italic">\
</span><span style="font-weight:bold;font-style:italic"></span> --vnet <span style="font-weight:bold">$(</span>vnetResourceId<span style="font-weight:bold">)</span> <span style="font-weight:bold;font-style:italic">\
</span><span style="font-weight:bold;font-style:italic"></span> --subnet <span style="font-weight:bold">$(</span>subnetName<span style="font-weight:bold">)</span>
</code></pre></div><p>Fails with the error message</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback">ERROR: There was a conflict. SiteConfig.VnetRouteAllEnabled cannot be modified. Please modify the Site.VnetRouteAllEnabled property
</code></pre></div><h2 id="workaround">
Workaround
<a class="heading-link" href="#workaround">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h2>
<p>I&rsquo;m not really sure what happened or what changed. This pipeline is reused across many logic apps in different subscriptions and still works for some of the existing logic apps.</p>
<p>Searching for the error message, I was surprised by Bing Chat understanding the problem and suggesting a solution that worked!</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-bash" data-lang="bash">az resource update --resource-group &lt;group-name&gt; --name &lt;app-name&gt; --resource-type <span style="font-style:italic">&#34;Microsoft.Web/sites&#34;</span> --set properties.vnetRouteAllEnabled=[true|false]
</code></pre></div><p>I used that command with true as the value, and now my pipeline is back in business.</p></description></item><item><title>Continuous delivery for Azure Workbooks using Azure DevOps</title><link>https://davidtriana.com/posts/2024/cd-workbooks/</link><pubDate>Sun, 28 Apr 2024 00:00:00 +0000</pubDate><author>david@davidtriana.com (David Triana)</author><guid>https://davidtriana.com/posts/2024/cd-workbooks/</guid><description><h1 id="scenario">
Scenario
<a class="heading-link" href="#scenario">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h1>
<p>Azure Workbooks are great! They offer an easy-to-use graphical designer for putting together interactive queries and reports, with no need to code, and they are available directly in the portal, so there is no need to host a new application.</p>
<p>I have been using Azure Workbooks for the last couple of months to show summaries and details on failures for several business applications, some running as Logic Apps Standard, some as Azure Container Apps.</p>
<p>In this blog post I share the way I automate the deployment of these Workbooks, following patterns similar to the ones used in continuous deployment for regular applications. A workbook is not really an application in the sense of needing to pull dependencies, build, and deploy, but following these patterns satisfies the need for inspection, approvals, and reuse between environments.</p>
<h2 id="creating-log-entries-to-populate-the-workbook">
Creating log entries to populate the workbook
<a class="heading-link" href="#creating-log-entries-to-populate-the-workbook">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h2>
<p>This section only addresses the need for data to show in the workbook; real applications are probably already creating all these logs. The code below is a simple console application. Starting from <a href="https://learn.microsoft.com/en-us/azure/azure-monitor/app/ilogger?tabs=dotnet6#console-application">Microsoft&rsquo;s sample code</a>, I added console output to be able to check the outcome live while running. The program creates log entries simulating a process that runs every 5 seconds, doing 5 tasks each time, with some tasks taking longer and failures every once in a while.</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-cs" data-lang="cs"><span style="font-weight:bold">using</span> <span style="font-weight:bold">System.Diagnostics</span>;
<span style="font-weight:bold">using</span> <span style="font-weight:bold">Microsoft.ApplicationInsights.Channel</span>;
<span style="font-weight:bold">using</span> <span style="font-weight:bold">Microsoft.ApplicationInsights.Extensibility</span>;
<span style="font-weight:bold">using</span> <span style="font-weight:bold">Microsoft.Extensions.DependencyInjection</span>;
<span style="font-weight:bold">using</span> <span style="font-weight:bold">Microsoft.Extensions.Logging</span>;
<span style="font-weight:bold">using</span> <span style="font-weight:bold">var</span> channel = <span style="font-weight:bold">new</span> InMemoryChannel();
<span style="font-weight:bold">try</span>
{
IServiceCollection services = <span style="font-weight:bold">new</span> ServiceCollection();
services.Configure&lt;TelemetryConfiguration&gt;(config =&gt; config.TelemetryChannel = channel);
services.AddLogging(builder =&gt;
{
<span style="font-style:italic">// Only Application Insights is registered as a logger provider
</span><span style="font-style:italic"></span> builder.AddApplicationInsights(
configureTelemetryConfiguration: (config) =&gt; config.ConnectionString = <span style="font-style:italic">&#34;---the connection string---&#34;</span>,
configureApplicationInsightsLoggerOptions: (options) =&gt; { }
);
builder.AddJsonConsole(options =&gt;
{
options.IncludeScopes = <span style="font-weight:bold">true</span>;
options.TimestampFormat = <span style="font-style:italic">&#34;HH:mm:ss &#34;</span>;
});
});
<span style="">var</span> serviceProvider = services.BuildServiceProvider();
<span style="">var</span> logger = serviceProvider.GetRequiredService&lt;ILogger&lt;Program&gt;&gt;();
<span style="">var</span> cancellationTokenSource = <span style="font-weight:bold">new</span> CancellationTokenSource();
<span style="font-weight:bold">await</span> MainLoop(cancellationTokenSource.Token,
logger, 5000).ConfigureAwait(<span style="font-weight:bold">false</span>);
logger.LogInformation(<span style="font-style:italic">&#34;Logger is working...&#34;</span>);
}
<span style="font-weight:bold">finally</span>
{
<span style="font-style:italic">// Explicitly call Flush() followed by Delay, as required in console apps.
</span><span style="font-style:italic"></span> <span style="font-style:italic">// This ensures that even if the application terminates, telemetry is sent to the back end.
</span><span style="font-style:italic"></span> channel.Flush();
<span style="font-weight:bold">await</span> Task.Delay(TimeSpan.FromMilliseconds(1000));
}
<span style="font-weight:bold">return</span>;
<span style="font-weight:bold">static</span> <span style="font-weight:bold">async</span> Task MainLoop(
CancellationToken cancellationToken,
ILogger&lt;Program&gt; logger,
<span style="">int</span> frequencyInMilliSeconds)
{
<span style="">var</span> totalRuns=0;
<span style="font-weight:bold">while</span> (!cancellationToken.IsCancellationRequested &amp;&amp; totalRuns &lt; 200)
{
<span style="font-style:italic">// Create a child task that runs in parallel
</span><span style="font-style:italic"></span> <span style="">var</span> childTask = Task.Run(<span style="font-weight:bold">async</span> () =&gt;
{
<span style="">var</span> transactionId = DateTime.UtcNow.ToString(<span style="font-style:italic">&#34;yyyyMMddHHmmssfff&#34;</span>);
<span style="font-weight:bold">using</span> (logger.BeginScope(<span style="font-style:italic">&#34;{transactionId}&#34;</span>, transactionId))
{
<span style="">var</span> totalTimeTaken = <span style="font-weight:bold">new</span> Stopwatch();
totalTimeTaken.Start();
logger.LogInformation(
<span style="font-style:italic">&#34;Pest finder {eventTypeName}&#34;</span>,
<span style="font-style:italic">&#34;started&#34;</span>);
<span style="font-weight:bold">try</span>
{
<span style="">var</span> stopWatch = <span style="font-weight:bold">new</span> Stopwatch();
<span style="font-weight:bold">for</span> (<span style="">var</span> i = 1; i &lt; 6; i++)
{
stopWatch.Restart();
<span style="font-weight:bold">await</span> Task.Delay(RandomNumber(1000, 20000), cancellationToken);
<span style="font-weight:bold">if</span> (RandomNumber(0, 100) &gt; 90) <span style="font-weight:bold">throw</span> <span style="font-weight:bold">new</span> Exception(<span style="font-style:italic">&#34;Pest finder overrun!&#34;</span>);
logger.LogInformation(
<span style="font-style:italic">&#34;Pest finder task {taskNumber} took {timeTaken}ms&#34;</span>,
i,
stopWatch.ElapsedMilliseconds);
}
logger.LogInformation(
<span style="font-style:italic">&#34;Pest finder {eventTypeName} and took {totalTimeTaken}ms&#34;</span>,
<span style="font-style:italic">&#34;completed&#34;</span>, totalTimeTaken.ElapsedMilliseconds);
}
<span style="font-weight:bold">catch</span> (Exception ex)
{
logger.LogError(ex,
<span style="font-style:italic">&#34;Pest finder {eventTypeName} with error {errorMessage} and took {totalTimeTaken}ms&#34;</span>,
<span style="font-style:italic">&#34;completed&#34;</span>, ex.Message, totalTimeTaken.ElapsedMilliseconds);
}
}
});
<span style="font-weight:bold">await</span> Task.Delay(frequencyInMilliSeconds, cancellationToken).ConfigureAwait(<span style="font-weight:bold">false</span>);
totalRuns++;
}
}
<span style="font-weight:bold">static</span> <span style="">int</span> RandomNumber(<span style="">int</span> min, <span style="">int</span> max)
{
<span style="">var</span> random = <span style="font-weight:bold">new</span> Random();
<span style="font-weight:bold">return</span> random.Next(min, max);
}
</code></pre></div><h2 id="creating-the-workbook">
Creating the workbook
<a class="heading-link" href="#creating-the-workbook">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h2>
<p>Now that we have some logs in place, it&rsquo;s time to create the workbook. In the Azure Portal, find the Application Insights instance that was receiving these logs, then navigate to workbooks</p>
<p><img src="https://davidtriana.com/images/posts/2024/workbooks-navigate.png" alt="Navigate to workbooks link"></p>
<p>The workbooks screen offers two templates; select the default template</p>
<p><img src="https://davidtriana.com/images/posts/2024/workbooks-select-default-template.png" alt="Template options"></p>
<p>Workbooks are composed of blocks, added vertically one after the other. The default template adds two blocks: one text block and one query block</p>
<p><img src="https://davidtriana.com/images/posts/2024/workbooks-initial-template.png" alt="Default template"></p>
<p>I updated the query in the query block according to the logs I created</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback">union * | where customDimensions.eventTypeName == &#39;completed&#39; | summarize count() by bin(timestamp, 1m), itemType
| render barchart
</code></pre></div><p><img src="https://davidtriana.com/images/posts/2024/workbooks-first-query.png" alt="First query"></p>
<p>And this already shows the value of workbooks: the ability to produce very nice reports and summaries from logs, without code and without deploying an additional application. This first query shows a summary.</p>
<p>I used the +Add option to add a new query block</p>
<p><img src="https://davidtriana.com/images/posts/2024/workbooks-add-query.png" alt="add query"></p>
<p>and in this next query I&rsquo;m getting the details of all the processes that failed</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback">exceptions | project timestamp, TransactionId = customDimensions.transactionId, Error = customDimensions.errorMessage
</code></pre></div><p>To add some interactivity I configured the &ldquo;export parameter&rdquo; feature so that when a row from the results is selected, the selected value is made available as a parameter for the next query</p>
<p><img src="https://davidtriana.com/images/posts/2024/workbooks-second-query-add-parameter.png" alt="Adding parameters"></p>
<p>Next I added the last query, to show all the log entries for that process run</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback">union traces,exceptions | where customDimensions.transactionId == {txid}
| order by timestamp asc
| project timestamp, message = iif(itemType==&#39;trace&#39;, message, customDimensions.FormattedMessage), timetaken = customDimensions.timeTaken
</code></pre></div><p>The final result looks like this</p>
<p><img src="https://davidtriana.com/images/posts/2024/workbooks-end-result.png" alt="Final result"></p>
<p>Workbooks also allow &ldquo;link&rdquo; columns, with the ability to directly open Azure Portal blades or invoke actions, as shown <a href="https://techcommunity.microsoft.com/t5/azure-integration-services-blog/extending-logic-apps-app-insight-integration-with-azure/ba-p/3784062">here</a> for Logic Apps Standard where the workbook includes links to the run details and to the &ldquo;resubmit&rdquo; action for the workflow.</p>
<h2 id="creating-the-pipeline">
Creating the pipeline
<a class="heading-link" href="#creating-the-pipeline">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h2>
<p>The workbook is now in place and working. However, this was done manually in the Azure Portal, something I don&rsquo;t want to repeat for my TEST, STAGING, PERF, or PRODUCTION environments. Just like code, this ideally needs to be version-managed and go through deployment pipelines that parameterize by environment when needed and control the approvals process to promote from lower to higher environments.</p>
<p>The workbooks user interface in the portal already provides some help for this by producing the ARM template needed to deploy the workbook via the Azure CLI. To obtain the ARM template, in the workbook editor view select the advanced editor</p>
<p><img src="https://davidtriana.com/images/posts/2024/workbooks-advanced-editor-button.png" alt="Advanced editor"></p>
<p>Then use the ARM template from the options offered by the Azure Portal</p>
<p><img src="https://davidtriana.com/images/posts/2024/workbooks-export-options.png" alt="Export options"></p>
<p>This option works; however, the content of the workbook ends up in one single place, the serializedData property, which makes it harder to inspect in code reviews and pull requests. The option I use is the first one, the &ldquo;Gallery template&rdquo; option, which provides the full content of the workbook in an easy-to-read, easy-to-inspect JSON format.</p>
<p>To use this option I save this content as a JSON file, which I then add to source control, then pull during deployment using the loadTextContent bicep function.</p>
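Before wiring this into a pipeline, the template can also be exercised locally with the Azure CLI. A minimal sketch, where the resource group name and the parameter value are placeholders rather than values from this post:

```shell
# Hypothetical local deployment of the workbook Bicep template.
# "rg-workbooks-dev" and "dev" are placeholder values; template-workbooks.bicep
# is the file referenced by the pipeline below.
az deployment group create \
  --resource-group rg-workbooks-dev \
  --template-file template-workbooks.bicep \
  --parameters environmentName=dev
```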
<p>Assuming separate subscriptions per environment, my Bicep template looks like this. A critical piece is the workbook&rsquo;s name: it needs to be unique, yet stable between pipeline runs, so that the existing workbook is updated instead of a new workbook being added. The guid function accepts as many parameters as needed, so depending on the project I might need to add more parameters to make the name unique.</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback">@description(&#39;The datacenter to use for the deployment.&#39;)
param location string = resourceGroup().location
param environmentName string
param environmentShortNameUpperCase string = toUpper(environmentName)
param workbookSourceId string = &#39;/subscriptions/${subscription().subscriptionId}/resourceGroups/${resourceGroup().name}/providers/microsoft.insights/components/appinsights-${environmentName}&#39;
resource existing_application_insights &#39;Microsoft.Insights/components@2020-02-02&#39; existing = {
name: &#39;appinsights-${environmentName}&#39;
scope: resourceGroup()
}
resource ProcessRunsSummaryWorkbook &#39;Microsoft.Insights/workbooks@2023-06-01&#39; = {
name: guid(subscription().id, resourceGroup().id, existing_application_insights.id)
location: location
tags: {
costCenter: &#39;Demos&#39;
project: &#39;Demos&#39;
}
kind: &#39;shared&#39;
properties: {
category: &#39;workbook&#39;
displayName: &#39;Pest control runs - ${environmentShortNameUpperCase}&#39;
serializedData: loadTextContent(&#39;PestControlWorkbook.json&#39;)
sourceId: workbookSourceId
version: &#39;1.0&#39;
}
dependsOn: [
existing_application_insights
]
}
</code></pre></div><p>To deploy this bicep template I use the Azure Devops deployment task</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-yaml" data-lang="yaml">- <span style="font-weight:bold">task</span>: AzureResourceManagerTemplateDeployment@3
<span style="font-weight:bold">displayName</span>: <span style="font-style:italic">&#39;Deploy Workbook&#39;</span>
<span style="font-weight:bold">inputs</span>:
<span style="font-weight:bold">azureResourceManagerConnection</span>: ${{ parameters.serviceConnection }}
<span style="font-weight:bold">subscriptionId</span>: <span style="font-style:italic">&#39;$(subscriptionId)&#39;</span>
<span style="font-weight:bold">action</span>: <span style="font-style:italic">&#39;Create Or Update Resource Group&#39;</span>
<span style="font-weight:bold">resourceGroupName</span>: $(resourceGroupName)
<span style="font-weight:bold">location</span>: $(resourceGroupLocation)
<span style="font-weight:bold">csmFile</span>: <span style="font-style:italic">&#39;$(Pipeline.Workspace)/$(artifactName)/template-workbooks.bicep&#39;</span>
<span style="font-weight:bold">overrideParameters</span>: &gt;-<span style="font-style:italic">
</span><span style="font-style:italic"> </span> -environmentName $(environmentShortName)
<span style="font-weight:bold">deploymentMode</span>: <span style="font-style:italic">&#39;Incremental&#39;</span>
</code></pre></div><p>This task is called from a multistage pipeline that takes care of each of the environments, where a typical stage looks like this</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-yaml" data-lang="yaml">- <span style="font-weight:bold">stage</span>: STAGING
<span style="font-weight:bold">displayName</span>: <span style="font-style:italic">&#39;STAGING Deployment&#39;</span>
<span style="font-weight:bold">variables</span>:
- <span style="font-weight:bold">template</span>: pipeline-variables.yml
<span style="font-weight:bold">parameters</span>:
<span style="font-weight:bold">environmentShortName</span>: <span style="font-style:italic">&#39;stg&#39;</span>
<span style="font-weight:bold">subscriptionId</span>: <span style="font-style:italic">&#39;---my guid---&#39;</span>
<span style="font-weight:bold">jobs</span>:
- <span style="font-weight:bold">template</span>: templates/iac-template.yml
<span style="font-weight:bold">parameters</span>:
<span style="font-weight:bold">azDevOpsEnvironment</span>: <span style="font-style:italic">&#39;Pest Control Staging&#39;</span>
<span style="font-weight:bold">serviceConnection</span>: <span style="font-style:italic">&#39;azure-staging-service-connection&#39;</span>
</code></pre></div><p>A complete example of multistage pipelines for infrastructure as code (IaC) and continuous integration (CI) and continuous delivery (CD) can be found in the <a href="https://github.com/Azure/logicapps/tree/master/azure-devops-sample/.pipelines/classic">Microsoft guidance for DevOps with Azure Standard Logic Apps</a></p></description></item><item><title>Dynamic Loops In Azure DevOps Pipelines</title><link>https://davidtriana.com/posts/2024/azdevops.loops/</link><pubDate>Fri, 26 Jan 2024 00:00:00 +0000</pubDate><author>david@davidtriana.com (David Triana)</author><guid>https://davidtriana.com/posts/2024/azdevops.loops/</guid><description><h1 id="scenario">
Scenario
<a class="heading-link" href="#scenario">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h1>
<p>Being used to programming, loops feel very basic: for each one of these, do that. However, doing this in Azure DevOps pipelines turned out to be not so easy.</p>
<p>It is trivial for a previously known set, like an array of environments [&lsquo;dev&rsquo;,&lsquo;qa&rsquo;,&lsquo;prod&rsquo;]; no problem there. But what if the set is determined dynamically, while the pipeline is running?</p>
<p>This is exactly my need. I have a container app, publicly exposed to the internet, with a set of IP rules so that it can only be called by a Logic App. The thing is, the Logic App does not have a single outgoing IP, and those IPs can change from deployment to deployment, so I need to obtain the list of IPs and then loop over them to add each one as a rule.</p>
<p>Of course I could deploy both the container app and the logic app in a VNET to avoid the need for these rules. Unfortunately I&rsquo;m not there yet; other requirements and limitations make it necessary to keep them public.</p>
<p>So why is it not so easy?</p>
<p>The Azure DevOps loop support relies on the each keyword explained <a href="https://learn.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops#each-keyword">here</a>:</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-yaml" data-lang="yaml"><span style="font-weight:bold">parameters</span>:
- <span style="font-weight:bold">name</span>: listOfStrings
<span style="font-weight:bold">type</span>: object
<span style="font-weight:bold">default</span>:
- one
- two
<span style="font-weight:bold">steps</span>:
- ${{ each value in parameters.listOfStrings }}:
- <span style="font-weight:bold">script</span>: echo ${{ value }}
</code></pre></div><p>However, the <code>${{ }}</code> syntax is not evaluated at runtime. Those expressions are expanded before the pipeline runs, as explained <a href="https://stackoverflow.com/a/75832425">here</a>.</p>
<h1 id="using-bash-inside-the-task-to-run-the-loop">
Using bash inside the task to run the loop
<a class="heading-link" href="#using-bash-inside-the-task-to-run-the-loop">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h1>
<p>The way I worked around this is by using bash loops; the same can be accomplished with PowerShell.</p>
<p>My code:</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-yaml" data-lang="yaml">- <span style="font-weight:bold">task</span>: AzureCLI@2
<span style="font-weight:bold">displayName</span>: Add Ip Security Allow Logic Apps
<span style="font-weight:bold">name</span>: AddIpSecurityRestrictionsAllowLAs
<span style="font-weight:bold">inputs</span>:
<span style="font-weight:bold">azureSubscription</span>: myServiceConnectionName
<span style="font-weight:bold">scriptType</span>: <span style="font-style:italic">&#39;bash&#39;</span>
<span style="font-weight:bold">scriptLocation</span>: <span style="font-style:italic">&#39;inlineScript&#39;</span>
<span style="font-weight:bold">inlineScript</span>: |<span style="font-style:italic">
</span><span style="font-style:italic"> MSYS_NO_PATHCONV=1
</span><span style="font-style:italic"> echo &#34;Full Ip List is $(GetLogicAppsOutgoingIps.logicApps_outgoing)&#34;
</span><span style="font-style:italic"> IFS=&#39;,&#39; read -ra la_ips &lt;&lt;&lt; &#34;$(GetLogicAppsOutgoingIps.logicApps_outgoing)&#34;
</span><span style="font-style:italic"> for i in &#34;${la_ips[@]}&#34;;
</span><span style="font-style:italic"> do
</span><span style="font-style:italic"> DATE=$(date &#39;+%Y%m%d%H%M%S&#39;)
</span><span style="font-style:italic"> echo &#34;Adding rule for $i&#34;
</span><span style="font-style:italic"> az containerapp ingress access-restriction set \
</span><span style="font-style:italic"> --name theNameOfMyContainerApp \
</span><span style="font-style:italic"> --resource-group theNameOfTheResourceGroup \
</span><span style="font-style:italic"> --rule-name &#34;Allow My Logic App $DATE&#34; \
</span><span style="font-style:italic"> --ip-address &#34;$i/32&#34; \
</span><span style="font-style:italic"> --action Allow
</span><span style="font-style:italic"> done</span>
</code></pre></div><p>I&rsquo;m adding the date to make the rule name unique, and the syntax for splitting the string and looping in bash is explained <a href="https://stackoverflow.com/a/918931">here</a>.</p>
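The split-and-loop pattern can be tried on its own, outside the pipeline. A small sketch where a hard-coded list stands in for the pipeline variable that normally supplies the IPs:

```shell
# Simulate the pipeline variable with a hard-coded comma-separated list.
ips="10.0.0.4,10.0.0.5,10.0.0.6"

# IFS=',' makes read split on commas; -r avoids backslash handling,
# and -a fills the la_ips array with the pieces.
IFS=',' read -ra la_ips <<< "$ips"

# One rule per IP, same shape as the pipeline loop above.
for i in "${la_ips[@]}"; do
  echo "Adding rule for $i/32"
done
```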
<p>The <code>GetLogicAppsOutgoingIps.logicApps_outgoing</code> variable is set by a template that runs another AzureCLI bash script, which queries the outgoing IPs and publishes them as an output variable:</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-bash" data-lang="bash">logicAppIps=<span style="font-weight:bold">$(</span> az logicapp show -n theLogicAppName -g theResourceGroupName --query <span style="font-style:italic">&#39;outboundIpAddresses&#39;</span> -o tsv<span style="font-weight:bold">)</span>
<span style="font-weight:bold">if</span> [[ $logicAppIps == <span style="font-style:italic">&#34;&#34;</span> ]]; <span style="font-weight:bold">then</span>
exit 1
<span style="font-weight:bold">fi</span>
echo <span style="font-style:italic">&#34;##vso[task.setvariable variable=logicApps_outgoing;isoutput=true;isreadonly=true;]</span>$logicAppIps<span style="font-style:italic">&#34;</span>
</code></pre></div><p>And there it is: a dynamic loop in an Azure DevOps pipeline.</p></description></item><item><title>Logic Apps Standard DB2 Query and Parameters</title><link>https://davidtriana.com/posts/2024/db2-queries/</link><pubDate>Fri, 12 Jan 2024 00:00:00 +0000</pubDate><author>david@davidtriana.com (David Triana)</author><guid>https://davidtriana.com/posts/2024/db2-queries/</guid><description><h1 id="scenario">
Scenario
<a class="heading-link" href="#scenario">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h1>
<p>In Logic Apps Standard we now have these in-app DB2 actions: no need for managed connections, just make DB2 calls directly from the Logic App.</p>
<p><img src="https://davidtriana.com/images/posts/2024/add_db2_actions.png" alt="DB2 Shapes"></p>
<p>While using these shapes, the Execute Query action in particular, it took me a while to figure out how to set the parameter for the query.</p>
<h1 id="adjusting-the-query-and-parameter">
Adjusting the query and parameter
<a class="heading-link" href="#adjusting-the-query-and-parameter">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h1>
<p>What worked for me was using the ? sign as the parameter placeholder in the query, and a numeric index as the key in the parameter definition.</p>
<p><img src="https://davidtriana.com/images/posts/2024/db2_query.png" alt="DB2 Query"></p>
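<p>In text form, as a sketch only (the table and column names and the value here are invented for illustration), the pattern is a positional <code>?</code> marker in the query text and a numeric key in the parameter definition:</p>

```
-- Query text: a positional marker, not a named parameter
SELECT ORDER_ID, STATUS FROM ORDERS WHERE CUSTOMER_ID = ?

-- Parameter definition: the key is the 1-based index of the marker
-- Name: 1    Value: 8812
```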
<p>This was after trying named parameters, the @ sign, and reading the links on the first two pages of Bing results for this question without any luck. This solution worked for me; however, it might be different for you depending on the underlying database and its configuration. In my case this is against an Informix 14 database.</p>
Scenario
<a class="heading-link" href="#scenario">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h1>
<p>Troubleshooting Web APIs, especially once they are released to the world, often requires the submitted payload in order to reproduce the issue. HTTP payloads are not captured by logging by default, and it should stay that way to avoid introducing performance and privacy issues; however, when needed, there are many ways to capture them.</p>
<p>Some teams will explicitly add code to log the request, for each of the requests where the payload is needed. This is good, but adds a lot of code to maintain, at least one line per API method.</p>
<p>Other teams will create a generic logging mechanism, introduced at runtime via a wrapper on each call or any of the middleware interceptors available in .NET. This is good since it can be maintained in a single place, but it remains custom, additional code.</p>
<p>My team, looking for ways to add as little code as possible, relies on the built-in HttpLogging service. It is not as customizable as custom code, but it ships with the framework: no custom logging code is needed for it to work.</p>
<h1 id="adding-httplogging">
Adding HTTPLogging
<a class="heading-link" href="#adding-httplogging">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h1>
<p>To add HTTP logging to a .NET 7 Web API project:</p>
<ol>
<li>In program.cs inside the static main method, when declaring the services but before creating the app</li>
</ol>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback"> builder.Services.AddHttpLogging(logging =&gt;
{
logging.LoggingFields = HttpLoggingFields.RequestBody;
logging.RequestBodyLogLimit = 1024;
});
</code></pre></div><p>LoggingFields is a flags property, so other fields can be combined in. Here I&rsquo;m only interested in the request body, since other characteristics of the request, like the URL and method, are already captured by the request logs. The RequestBodyLogLimit property, which is in bytes, prevents polluting the log with very long payloads, whether accidental, malicious, or legitimate on endpoints where users submit images.</p>
<ol>
<li>In program.cs inside the static main method, after creating the app</li>
</ol>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback">app.UseWhen(
context =&gt; !(
context.Request.Path.StartsWithSegments(&#34;/api/user/login&#34;) ||
context.Request.Path.StartsWithSegments(&#34;/api/user/password&#34;)),
builder =&gt; builder.UseHttpLogging());
</code></pre></div><p>Here I use the UseWhen filter to prevent logging passwords. In general, logging secrets or any other kind of confidential information should be prevented. More complex APIs might require much more sophisticated mechanisms to avoid privacy incidents while keeping the ability to troubleshoot problematic HTTP requests.</p>
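<p>The predicate itself is just path-prefix matching. Purely as an illustration of the exclusion logic (sketched in shell, not .NET; note that StartsWithSegments matches whole path segments, so this glob approximation is slightly looser):</p>

```shell
# Approximation of the UseWhen predicate: exclude the login and password
# endpoints from body logging, log everything else.
should_log() {
  case "$1" in
    /api/user/login*|/api/user/password*) return 1 ;;  # excluded paths
    *) return 0 ;;                                     # everything else is logged
  esac
}
```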
<ol>
<li>In appsettings.json, in the Logging section, add the relevant lines for <code>Microsoft.AspNetCore.HttpLogging.HttpLoggingMiddleware</code> based on</li>
</ol>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback"> &#34;Logging&#34;: {
&#34;LogLevel&#34;: {
&#34;Default&#34;: &#34;Information&#34;,
&#34;Microsoft.AspNetCore.HttpLogging.HttpLoggingMiddleware&#34;: &#34;Information&#34;,
&#34;Microsoft.AspNetCore&#34;: &#34;Warning&#34;,
&#34;Microsoft&#34;: &#34;Warning&#34;,
&#34;Microsoft.Hosting.Lifetime&#34;: &#34;Information&#34;
},
&#34;ApplicationInsights&#34;: {
&#34;LogLevel&#34;: {
&#34;Default&#34;: &#34;Warning&#34;,
&#34;Microsoft.AspNetCore.HttpLogging.HttpLoggingMiddleware&#34;: &#34;Information&#34;
}
}
},
</code></pre></div><p>The ApplicationInsights section is only needed if that&rsquo;s where you want your logs to go.</p>
<h1 id="finding-the-logs-in-application-insights">
Finding the logs in Application Insights
<a class="heading-link" href="#finding-the-logs-in-application-insights">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h1>
<p>Running in Visual Studio or looking at the logs stream in Azure App Services will show the entries. In Application Insights the logs will be of type Trace, severity information, and the message starts with RequestBody, for example</p>
<table>
<thead>
<tr>
<th>Local Time</th>
<th>Type</th>
<th>Details</th>
</tr>
</thead>
<tbody>
<tr>
<td>9:41:40.179 AM</td>
<td>Trace</td>
<td><strong>Severity level:</strong> Information, <strong>Message:</strong> RequestBody: {&ldquo;productId&rdquo;:&ldquo;INFANTRYALPHA2023&rdquo;,&ldquo;quantity&rdquo;:3}</td>
</tr>
</tbody>
</table></description></item><item><title>Listing Azure Function Apps Framework Version</title><link>https://davidtriana.com/posts/2022/listing-azure-functionapps-framework/</link><pubDate>Tue, 15 Nov 2022 00:00:00 +0000</pubDate><author>david@davidtriana.com (David Triana)</author><guid>https://davidtriana.com/posts/2022/listing-azure-functionapps-framework/</guid><description><h1 id="scenario">
Scenario
<a class="heading-link" href="#scenario">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h1>
<p>Azure Function Apps running on .NET Core 3.1 will go out of support on Dec 3 2022, something the Azure Portal reminds users of every time they visit their outdated functions. If, like me, you have several functions, wouldn&rsquo;t it be nice to get a list of all the functions that should be updated?</p>
<h1 id="solution">
Solution
<a class="heading-link" href="#solution">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h1>
<p>A simple <code>az functionapp list</code> should do the trick; however, the properties that provide this information live in a nested object, siteConfig, which is not populated by the <code>list</code> command. After searching for a while I found <a href="https://github.com/Azure/azure-cli/issues/21548#issuecomment-1061634921">this solution for WebApps</a>; the equivalent command for Function Apps is</p>
<p><code>az functionapp list --query '[].id' -o tsv | xargs | xargs -I{} bash -c &quot;az functionapp config show --ids {} --query '[].[name, resourceGroup,linuxFxVersion, netFrameworkVersion]' --out table&quot;</code></p>
<p>This command will list all the functions in the current subscription, provided proper permissions.</p>
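<p>The two-stage xargs plumbing can be sanity-checked with a harmless stand-in for the az calls, echoing instead of querying (the ids below are made up):</p>

```shell
# Stand-in ids, as if produced by: az functionapp list --query '[].id' -o tsv
ids=$(printf 'id-a\nid-b\nid-c\n')

# The first xargs joins all lines into a single space-separated line
joined=$(printf '%s\n' "$ids" | xargs)

# The second xargs substitutes that whole line for {} in one command,
# so config show receives all ids in a single --ids call
result=$(printf '%s\n' "$joined" | xargs -I{} bash -c 'echo "config show --ids {}"')
echo "$result"
```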
<p>The output looks like this:</p>
<table>
<thead>
<tr>
<th>Column1</th>
<th>Column2</th>
<th>Column3</th>
<th>Column4</th>
</tr>
</thead>
<tbody>
<tr>
<td>tseries-urban-cars</td>
<td>cars-classifier</td>
<td>DOTNET|6.0</td>
<td>v6.0</td>
</tr>
<tr>
<td>tseries-urban-buildings</td>
<td>buildings-location</td>
<td>v4.0</td>
<td></td>
</tr>
<tr>
<td>tseries-urban-weapons</td>
<td>weapons-catalog</td>
<td>DOTNET|6.0</td>
<td>v6.0</td>
</tr>
<tr>
<td>targets</td>
<td>targets-catalog</td>
<td>DOTNET|3.1</td>
<td>v4.0</td>
</tr>
</tbody>
</table>
<p>Rows 1 and 3 were already updated, row 2 is not .NET, and the last row still needs to be updated.</p></description></item><item><title>Adding a COVID certificate link to the iPhone home screen</title><link>https://davidtriana.com/posts/2021/adding-covid-link-home/</link><pubDate>Wed, 01 Dec 2021 00:00:00 +0000</pubDate><author>david@davidtriana.com (David Triana)</author><guid>https://davidtriana.com/posts/2021/adding-covid-link-home/</guid><description><h1 id="iphone-home-screen-link">
iPhone home screen link
<a class="heading-link" href="#iphone-home-screen-link">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h1>
<p>Here in Colombia our COVID vaccine certificates are available as a PDF document. Opening a PDF on the phone is easy, but what about finding the PDF in the first place? That&rsquo;s what this post is about: creating a home screen link, like for any other app, to open the PDF.</p>
<p><img src="https://davidtriana.com/images/posts/2021/covid_card_18.png" alt="&ldquo;Home Screen&rdquo;" title="Home Screen"></p>
<ol>
<li>First, make sure to have the PDF available directly on the phone. In my case, I have the PDF available in the OneDrive app, but since I want this to work even offline, I need to copy the PDF locally.</li>
</ol>
<p><img src="https://davidtriana.com/images/posts/2021/covid_card_01.jpg" alt="&ldquo;Share File&rdquo;" title="Share File"></p>
<p>Click on the top right to open the share dialog, then click on &ldquo;Share File via&hellip;&rdquo;</p>
<p><img src="https://davidtriana.com/images/posts/2021/covid_card_02.jpg" alt="&ldquo;Save to files&rdquo;" title="Save to files"></p>
<p>Click save to files</p>
<p><img src="https://davidtriana.com/images/posts/2021/covid_card_03.jpg" alt="&ldquo;My Iphone&rdquo;" title="My Iphone"></p>
<p>Then select &ldquo;On My iPhone&rdquo;</p>
<p><img src="https://davidtriana.com/images/posts/2021/covid_card_04.png" alt="&ldquo;Shortcuts&rdquo;" title="Shortcuts"></p>
<p>Next, look for the &ldquo;Shortcuts&rdquo; app</p>
<p><img src="https://davidtriana.com/images/posts/2021/covid_card_05.png" alt="&ldquo;Shortcuts&rdquo;" title="Shortcuts"></p>
<p>Click the &lsquo;+&rsquo; symbol in the top right to create a new shortcut</p>
<p><img src="https://davidtriana.com/images/posts/2021/covid_card_07.png" alt="&ldquo;Glyph&rdquo;" title="Glyph"></p>
<p>Select the color and the graphic</p>
<p><img src="https://davidtriana.com/images/posts/2021/covid_card_08.png" alt="&ldquo;Name&rdquo;" title="Name"></p>
<p>Give it a name, then click &ldquo;Add Action&rdquo;</p>
<p><img src="https://davidtriana.com/images/posts/2021/covid_card_09.png" alt="&ldquo;Documents&rdquo;" title="Documents"></p>
<p>In Categories click Documents</p>
<p><img src="https://davidtriana.com/images/posts/2021/covid_card_10.png" alt="&ldquo;Open File&rdquo;" title="Open File"></p>
<p>Then click Open File</p>
<p><img src="https://davidtriana.com/images/posts/2021/covid_card_12.png" alt="&ldquo;Click File&rdquo;" title="Click File"></p>
<p>Click File</p>
<p><img src="https://davidtriana.com/images/posts/2021/covid_card_13.jpg" alt="&ldquo;Open File&rdquo;" title="Open File"></p>
<p>Then locate the file you saved in the first step. It should be right there, since this user interface shows recent files.</p>
<p><img src="https://davidtriana.com/images/posts/2021/covid_card_14.png" alt="&ldquo;Default App&rdquo;" title="Default App"></p>
<p>Click Default App and select Files</p>
<p><img src="https://davidtriana.com/images/posts/2021/covid_card_15.png" alt="&ldquo;Properties&rdquo;" title="Properties"></p>
<p>Now click on the settings button on the top right next to the close button</p>
<p><img src="https://davidtriana.com/images/posts/2021/covid_card_16.png" alt="&ldquo;add to home&rdquo;" title="add to home"></p>
<p>Click Add To Home Screen</p>
<p><img src="https://davidtriana.com/images/posts/2021/covid_card_17.png" alt="&ldquo;add&rdquo;" title="add"></p>
<p>Click Add in the top right</p>
<p><img src="https://davidtriana.com/images/posts/2021/covid_card_18.png" alt="&ldquo;new icon&rdquo;" title="new icon"></p>
<p>And there you have it: a new home screen link to open the PDF.</p></description></item><item><title>Renewing Microsoft Certifications</title><link>https://davidtriana.com/posts/2021/renewing-microsoft-certifications/</link><pubDate>Mon, 18 Oct 2021 00:00:00 +0000</pubDate><author>david@davidtriana.com (David Triana)</author><guid>https://davidtriana.com/posts/2021/renewing-microsoft-certifications/</guid><description><h1 id="action-required---your-microsoft-certification-will-expire-in-180-days">
Action required - Your Microsoft Certification will expire in 180 days!
<a class="heading-link" href="#action-required---your-microsoft-certification-will-expire-in-180-days">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h1>
<p>That email subject, from May 21, asked for action: get one of my certifications renewed, or &ldquo;If your certification expires, you must earn the certification again by passing the required certification exam(s)&rdquo;.</p>
<p>This renewal process is <a href="https://docs.microsoft.com/en-us/learn/certifications/renew-your-microsoft-certification">new in many ways</a>; the main differences are</p>
<ul>
<li>The topics covered are not necessarily the same as for the original exam/certification</li>
<li>The exam itself is simpler than the proctored exam, no proctor, no dedicated exam app</li>
</ul>
<p>I renewed 3 certifications with this process:</p>
<ul>
<li><a href="https://docs.microsoft.com/en-us/learn/certifications/azure-developer/renew/">Azure Developer Associate</a></li>
<li><a href="https://docs.microsoft.com/en-us/learn/certifications/devops-engineer/renew/">DevOps Engineer Expert</a></li>
<li><a href="https://docs.microsoft.com/en-us/learn/certifications/azure-solutions-architect/renew/">Azure Solutions Architect Expert</a></li>
</ul>
<p>I passed all of them on the first try; however, the renewal exams can be repeated as many times as needed.</p>
<p><img src="https://davidtriana.com/images/posts/2021/exams.png" alt="&ldquo;Exams&rdquo;" title="Exams"></p>
<p>To prepare, I went through all the linked materials on each of the certification pages, all of which go to the Microsoft Learn site, and include theory, labs and exam. I highly recommend this approach because</p>
<ul>
<li>The linked materials are directly related to the questions in the renewal exam</li>
<li>The Microsoft Learn materials and labs are very well made, up to date and relevant</li>
</ul>
<p>During the renewal exam, without a proctor or dedicated app, nothing prevents you from navigating back to the learning materials or searching for the answers online. While this approach might work to renew the certification, I think it misses the point. Going through the materials to prepare for the exam, and then consciously answering the questions, fulfills the &ldquo;brush up on your skills&rdquo; objective in the invitation email.</p>
<p>About the individual exams, what I found is that all the topics are covered in the learning materials, and many topics from the original exams are not part of the renewal. The questions are less &ldquo;tricky&rdquo; than in the proctored exams, and all of them are multiple choice: no labs, no &ldquo;order the commands&rdquo;, nor any of the other question types available in the proctored exams.</p>
<h1 id="see-more">
See More
<a class="heading-link" href="#see-more">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h1>
<p>Something that happened to me while preparing for the DevOps exam is that, when opening the links to the learning materials, I missed some of them. During the exam I got some questions for which I wasn&rsquo;t prepared; afterwards I went back to the page and noticed the &ldquo;See More&rdquo; link at the bottom of the list of training materials. The same link is present for the Architect Expert exam, and I learned my lesson.</p>
<h1 id="celebrate">
Celebrate
<a class="heading-link" href="#celebrate">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h1>
<p><img src="https://davidtriana.com/images/posts/2021/celebrate.png" alt="&ldquo;Celebrate&rdquo;" title="Celebrate"></p></description></item><item><title>Querying Azure DevOps Pipelines YAML Files</title><link>https://davidtriana.com/posts/2021/querying-pipelines-yaml-files/</link><pubDate>Fri, 14 May 2021 00:00:00 +0000</pubDate><author>david@davidtriana.com (David Triana)</author><guid>https://davidtriana.com/posts/2021/querying-pipelines-yaml-files/</guid><description><h1 id="scenario">
Scenario
<a class="heading-link" href="#scenario">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h1>
<p>Your project has tens or hundreds of Azure DevOps Pipelines, and you need to determine if a particular YAML definition is used for any of the pipelines.</p>
<p>The Azure DevOps user interface provides this information, but it requires going pipeline by pipeline since the current experience doesn&rsquo;t show it in the pipelines list.</p>
<h2 id="querying-pipeline-details-with-the-azure-cli">
Querying pipeline details with the Azure CLI
<a class="heading-link" href="#querying-pipeline-details-with-the-azure-cli">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h2>
<p>The <a href="https://docs.microsoft.com/en-us/cli/azure/pipelines?view=azure-cli-latest#az_pipelines_show">Azure CLI allows querying pipelines</a>, for example, in PowerShell,</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-powershell" data-lang="powershell">az pipelines show --id 321
</code></pre></div><p>Returns a JSON with the details for the pipeline with Id 321</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-json" data-lang="json">{
<span style="font-weight:bold">&#34;__comment&#34;</span>: <span style="font-style:italic">&#34;Many properties omitted for brevity&#34;</span>,
<span style="font-weight:bold">&#34;authoredBy&#34;</span>: {
<span style="font-weight:bold">&#34;descriptor&#34;</span>: <span style="font-style:italic">&#34;aad.redacted&#34;</span>,
<span style="font-weight:bold">&#34;directoryAlias&#34;</span>: <span style="font-weight:bold">null</span>,
<span style="font-weight:bold">&#34;displayName&#34;</span>: <span style="font-style:italic">&#34;Miles Dyson&#34;</span>,
<span style="font-weight:bold">&#34;id&#34;</span>: <span style="font-style:italic">&#34;redacted-guid&#34;</span>,
<span style="font-weight:bold">&#34;imageUrl&#34;</span>: <span style="font-style:italic">&#34;https://dev.azure.com/cyberdyne/_apis/GraphProfile/MemberAvatars/aad.redacted&#34;</span>,
<span style="font-weight:bold">&#34;uniqueName&#34;</span>: <span style="font-style:italic">&#34;mdyson@cyberdyne.com&#34;</span>,
<span style="font-weight:bold">&#34;url&#34;</span>: <span style="font-style:italic">&#34;https://redactedurl&#34;</span>
},
<span style="font-weight:bold">&#34;createdDate&#34;</span>: <span style="font-style:italic">&#34;1997-08-04T16:41:48.533000+00:00&#34;</span>,
<span style="font-weight:bold">&#34;name&#34;</span>: <span style="font-style:italic">&#34;Skynet Infra&#34;</span>,
<span style="font-weight:bold">&#34;options&#34;</span>: <span style="font-weight:bold">null</span>,
<span style="font-weight:bold">&#34;path&#34;</span>: <span style="font-style:italic">&#34;\\Infra\\CD\\Skynet&#34;</span>,
<span style="font-weight:bold">&#34;process&#34;</span>: {
<span style="font-weight:bold">&#34;type&#34;</span>: 2,
<span style="font-weight:bold">&#34;yamlFilename&#34;</span>: <span style="font-style:italic">&#34;pipelines/Cyberdyne-Infra/Skynet.yml&#34;</span>
},
<span style="font-weight:bold">&#34;processParameters&#34;</span>: <span style="font-weight:bold">null</span>,
<span style="font-weight:bold">&#34;project&#34;</span>: {
<span style="font-weight:bold">&#34;id&#34;</span>: <span style="font-style:italic">&#34;redacted-guid&#34;</span>,
<span style="font-weight:bold">&#34;lastUpdateTime&#34;</span>: <span style="font-style:italic">&#34;1997-08-04T20:43:19.643Z&#34;</span>,
<span style="font-weight:bold">&#34;name&#34;</span>: <span style="font-style:italic">&#34;Skynet&#34;</span>,
<span style="font-weight:bold">&#34;revision&#34;</span>: 78,
<span style="font-weight:bold">&#34;state&#34;</span>: <span style="font-style:italic">&#34;wellFormed&#34;</span>,
<span style="font-weight:bold">&#34;url&#34;</span>: <span style="font-style:italic">&#34;https://dev.azure.com/cyberdyne/_apis/projects/redactedguid&#34;</span>,
<span style="font-weight:bold">&#34;visibility&#34;</span>: <span style="font-style:italic">&#34;private&#34;</span>
},
}
</code></pre></div><p>Since we are only interested in the yamlFilename, the --query parameter allows filtering, for example</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-powershell" data-lang="powershell">az pipelines show --id 321 --query <span style="font-weight:bold">process</span>.yamlFilename
</code></pre></div><p>Returns the YAML file for the pipeline with Id 321</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback">&#34;pipelines/Cyberdyne-Infra/Skynet.yml&#34;
</code></pre></div><h2 id="looping-over-all-pipelines">
Looping over all pipelines
<a class="heading-link" href="#looping-over-all-pipelines">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h2>
<p>Now that we know how to get the information, we need it for every pipeline, so first, get a list of all pipeline Ids,</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-powershell" data-lang="powershell">$pipelines = az pipelines list --query [].id | ConvertFrom-Json
</code></pre></div><p>Then iterate over the list to get the details for each of the pipelines, and save them to a file:</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-powershell" data-lang="powershell"><span style="font-weight:bold">ForEach</span> ($pipeline <span style="font-weight:bold">in</span> $pipelines) {az pipelines show --id $pipeline --query [id,name,process.yamlFilename]} | Out-File c:\pipelines.txt
</code></pre></div><h2 id="enjoy">
Enjoy!
<a class="heading-link" href="#enjoy">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h2>
<p>Open the resulting text file in your preferred text editor, and easily find the pipelines using the YAML file of interest</p></description></item><item><title>Deploying Log4Brains ADRs in Azure Static Sites</title><link>https://davidtriana.com/posts/2021/deploying-log4brains-adr-in-azure-static-sites/</link><pubDate>Mon, 03 May 2021 00:00:00 +0000</pubDate><author>david@davidtriana.com (David Triana)</author><guid>https://davidtriana.com/posts/2021/deploying-log4brains-adr-in-azure-static-sites/</guid><description><h1 id="scenario">
Scenario
<a class="heading-link" href="#scenario">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h1>
<p>Suppose your project has decided to use <a href="https://github.com/thomvaill/log4brains">Log4Brains</a> to record architectural decisions and now you need to make the resulting website available, in Microsoft Azure, via a CD Pipeline. To learn about <a href="https://en.wikipedia.org/wiki/Architectural_decision#Decision_documentation">ADRs</a> or architectural decision records I highly recommend <a href="https://ardalis.com/getting-started-with-architecture-decision-records/">Steve Smith&rsquo;s post about it</a>.</p>
<h2 id="install-log4brains-and-create-your-adrs">
Install Log4Brains and create your ADRs
<a class="heading-link" href="#install-log4brains-and-create-your-adrs">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h2>
<ul>
<li>Install <a href="https://github.com/thomvaill/log4brains#what-are-the-prerequisites">Log4Brains&rsquo;s prerequisites</a> to which I will add <a href="https://code.visualstudio.com/Download">VsCode</a>, with the <a href="https://marketplace.visualstudio.com/items?itemName=DavidAnson.vscode-markdownlint">.md extension</a></li>
<li>Install <a href="https://github.com/thomvaill/log4brains#-getting-started">Log4Brains</a> and follow the getting started guide. You should be able to preview the site, and create your own ADRs</li>
</ul>
<h2 id="commit-and-push-to-azdevops">
Commit and push to AzDevOps
<a class="heading-link" href="#commit-and-push-to-azdevops">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h2>
<p>It can be <a href="https://docs.microsoft.com/en-us/azure/devops/repos/git/create-new-repo?view=azure-devops">a new repository</a> or the existing repository with the application. Before pushing, ensure the generated output is ignored via your .gitignore file:</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback"># Log4Brains Output
.log4brains/
</code></pre></div><h2 id="add-pipeline-yaml-file">
Add pipeline YAML File
<a class="heading-link" href="#add-pipeline-yaml-file">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h2>
<p>The location and name of the file don&rsquo;t really matter: it can be at /, or in a Pipelines or similar folder. What matters is the content:</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback">trigger: none
pr: none
stages:
- stage: build
displayName: &#39;Build&#39;
jobs:
- job: build
displayName: &#39;Build&#39;
pool:
vmImage: &#39;ubuntu-latest&#39;
steps:
- task: NodeTool@0
inputs:
versionSpec: &#39;14.x&#39;
displayName: &#39;Install Node.js&#39;
- script: |
npm install -g log4brains
log4brains build
displayName: &#39;Install and Build Log4brains&#39;
- task: CopyFiles@2
displayName: &#39;Copy config to $(System.DefaultWorkingDirectory)/.log4brains/out&#39;
inputs:
Contents: staticwebapp.config.json
TargetFolder: &#39;$(System.DefaultWorkingDirectory)/.log4brains/out&#39;
- task: PublishPipelineArtifact@1
inputs:
targetPath: &#39;$(System.DefaultWorkingDirectory)/.log4brains/out&#39;
artifactName: staticSite
- stage: publish
displayName: &#39;Publish&#39;
dependsOn: build
jobs:
- deployment: deployProd
displayName: &#39;Deploy prod&#39;
environment: prod
strategy:
runOnce:
deploy:
steps:
- task: DownloadPipelineArtifact@2
displayName: Download static site artifact
inputs:
artifact: staticSite
path: $(Build.SourcesDirectory)/staticSite
- task: AzureStaticWebApp@0
displayName: Upload to Azure Static WebApp
inputs:
app_location: /staticSite
output_location: &#34;&#34;
env:
azure_static_web_apps_api_token: $(deployment_token)
</code></pre></div><p>This is a multistage pipeline: the first stage builds the application, or in this case runs the Log4Brains static site generator to get the HTML/CSS output, and makes it available via Azure DevOps Artifacts; the second stage pulls the artifact and publishes it to Azure Static Web Apps.</p>
<p>In the first stage we install Node, install Log4Brains, run the build command to generate the output, then publish the output as an artifact. There is a CopyFiles task before the publish task; it adds a special config .json file to the output. This <a href="https://docs.microsoft.com/en-us/azure/static-web-apps/configuration#example-configuration-file">special file</a> is not needed if you want your ADRs to be publicly available; however, if you want to take advantage of the <a href="https://docs.microsoft.com/en-us/azure/static-web-apps/authentication-authorization">Azure Static Web Apps authorization and authentication</a> capability, you want to copy this file, which gets created in the next section.</p>
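<p>To sanity-check that copy step locally, the equivalent shell commands would be as follows (the paths mirror the pipeline above; the config file content here is just a placeholder):</p>

```shell
# Create a placeholder config file, then mirror what the CopyFiles task does:
# place staticwebapp.config.json next to the generated site output.
printf '{ "routes": [] }\n' | tee staticwebapp.config.json
mkdir -p .log4brains/out
cp staticwebapp.config.json .log4brains/out/
```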
<p>The second stage pulls the artifact and calls the AzureStaticWebApp task to get the site published. The publish token is a pipeline variable that we will set later on the create pipeline step.</p>
<p>Git commit, git push, to make the file available in the repository.</p>
<h2 id="adding-the-static-web-page-configuration-file">
Adding the Static Web Page configuration file
<a class="heading-link" href="#adding-the-static-web-page-configuration-file">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h2>
<p>This is optional, only needed if you don&rsquo;t want your ADRs to be public. Azure Static Web Apps provide <a href="https://docs.microsoft.com/en-us/azure/static-web-apps/authentication-authorization">authorization and authentication capabilities</a>, configured via the staticwebapp.config.json file and by adding authorized users via the Azure Portal.</p>
<p>I created the staticwebapp.config.json file in /, and configured it:</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback">{
  &#34;routes&#34;: [
    {
      &#34;route&#34;: &#34;/*&#34;,
      &#34;allowedRoles&#34;: [&#34;reader&#34;]
    }
  ],
  &#34;responseOverrides&#34;: {
    &#34;401&#34;: {
      &#34;redirect&#34;: &#34;/.auth/login/aad&#34;,
      &#34;statusCode&#34;: 302
    },
    &#34;404&#34;: {
      &#34;rewrite&#34;: &#34;/404.html&#34;
    }
  }
}
</code></pre></div><p>This configuration says that every route is only allowed for the &ldquo;reader&rdquo; role, and that any request from a user who has not previously authenticated will be redirected to the Azure Active Directory authentication provider.</p>
<p>Git commit, git push, to make the file available in the repository.</p>
<h2 id="create-an-azure-static-website">
Create an Azure Static Website
<a class="heading-link" href="#create-an-azure-static-website">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h2>
<p>No surprises here: <a href="https://docs.microsoft.com/en-us/azure/static-web-apps/publish-devops#create-a-static-web-app">the steps on the portal</a> are very straightforward, and of course the CLI or a proper infrastructure-as-code pipeline will work as well.</p>
<h2 id="create-an-azure-devops-pipeline">
Create an Azure DevOps Pipeline
<a class="heading-link" href="#create-an-azure-devops-pipeline">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h2>
<p>Again, no surprises, <a href="https://docs.microsoft.com/en-us/azure/static-web-apps/publish-devops#create-the-pipeline-task-in-azure-devops">the steps in the guide are very clear</a>, except for steps 3 and 4: instead of the starter pipeline, select Existing Pipeline and point it to the YAML file created earlier, then proceed to the next steps, which set the variable with the token and run the pipeline.</p>
<p><img src="https://davidtriana.com/images/posts/2021/log4brains_pipeline.png" alt="&ldquo;Pipeline Success&rdquo;" title="Pipeline Success"></p>
<p>The artifacts link available on the pipeline output screen will let you verify the contents of the artifact, to ensure the index.html and staticwebapp.config.json files are present, which is very useful for troubleshooting.</p>
<p><img src="https://davidtriana.com/images/posts/2021/log4brains_artifact.png" alt="&ldquo;Artifact details&rdquo;" title="Artifact details"></p>
<h2 id="invite-users">
Invite users
<a class="heading-link" href="#invite-users">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h2>
<p>Navigate in the Azure Portal to the Static Web App resource, and click on Role management to invite users.</p>
<p><img src="https://davidtriana.com/images/posts/2021/log4brains_addrolesteps.png" alt="&ldquo;Role management&rdquo;" title="Role management"></p>
<p>In the invite users dialog, use &lsquo;reader&rsquo; as the role. If you use any other role, be sure to update the staticwebapp.config.json file accordingly.</p>
<p><img src="https://davidtriana.com/images/posts/2021/log4brains_addroledialog.png" alt="&ldquo;Role management&rdquo;" title="Role management"></p>
<h2 id="enjoy">
Enjoy!
<a class="heading-link" href="#enjoy">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h2>
<p>And with that, the site is available, with user authentication in place. The pipeline trigger can be updated so that every time a new ADR is checked in, the pipeline runs automatically.</p></description></item><item><title>Visual Studio Code Remote</title><link>https://davidtriana.com/posts/2020/vscode-remote/</link><pubDate>Fri, 16 Oct 2020 00:00:00 +0000</pubDate><author>david@davidtriana.com (David Triana)</author><guid>https://davidtriana.com/posts/2020/vscode-remote/</guid><description><p>Visual Studio Code Remote enables scenarios where the editing happens on the client machine, but all the processing, builds and dependencies happen somewhere else.</p>
<p>This is great for situations where the software being built requires a specific setup that conflicts with your current machine or operating system; to provide isolation between different projects or development environments; or to have access to more performant hardware or a better internet connection for builds, pulling dependencies, pushing containers, pulling Docker images, and many other scenarios where your local machine is not ideal.</p>
<p>A similar solution already exists, called <a href="https://azure.microsoft.com/en-us/services/visual-studio-online/">Visual Studio Codespaces</a>, which works pretty well but is being transitioned to a new home in <a href="https://devblogs.microsoft.com/visualstudio/visual-studio-codespaces-is-consolidating-into-github-codespaces/">GitHub</a>, and at the time of this blog post it is still in private preview. A great discussion about it is available <a href="https://www.dotnetrocks.com/?show=1708">here</a>.</p>
<p>The specific scenarios for which this technology is a great time saver for me are:</p>
<h4 id="build-environments-isolation">
Build environments isolation
<a class="heading-link" href="#build-environments-isolation">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h4>
<p>Different projects have different needs, and often those needs collide: different versions, different system configurations. With this technology I can have a specific remote for linux-python, linux-go, linux-dotnetcore, or be even more specific, by version or project, and have a trusted build environment for that specific need.</p>
<h4 id="high-bandwidth-requirements">
High bandwidth requirements
<a class="heading-link" href="#high-bandwidth-requirements">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h4>
<p>Living outside the city has many advantages, but broadband is not one of them, so if my productivity depends on downloading several Docker images, pushing containers, or having very good connectivity in general, then I&rsquo;m not productive. With this technology, however, all of that happens on the remote, where bandwidth is not a problem.</p>
<h2 id="setting-it-up">
Setting it up
<a class="heading-link" href="#setting-it-up">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h2>
<p>While the steps to set this up are already documented <a href="https://code.visualstudio.com/docs/remote/ssh-tutorial">here</a>, this blog post goes over some of the details to make it work. For this, I created a new environment: on the client side a new Windows 10 Pro VM, and for the remote a CentOS VM, both in Azure. Of course the idea is to use your own computer as the client; I created a new VM to see it work from scratch and be able to document all the steps.</p>
<p>I created the Windows VM from the portal in a new resource group called remotevs; for the Linux VM I used the Azure CLI from the cloud shell in the portal:</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback">az vm create --resource-group remotevs --name remotevsserver --image OpenLogic:CentOS:7.7:7.7.2020062400 --admin-username azureuser --generate-ssh-keys
</code></pre></div><p>Here I used the CentOS image just because I&rsquo;m studying for the Linux Foundation certification; any other image should work fine, as happens with the one I use to build this blog, which is based on Ubuntu.</p>
<p>Once the VMs were running I opened a remote desktop session on the Windows machine and installed Visual Studio Code. Right after that I clicked on the link to get the SSH extension and got it installed in VSCode.</p>
<p><a href="https://davidtriana.com/images/posts/2020/vscode-remote-extension.png"><img src="https://davidtriana.com/images/posts/2020/vscode-remote-extension.png" alt="VSCode Remote SSH Extension" title="VSCode Remote SSH Extension"></a></p>
<p>The documentation indicates that an SSH client is needed on the client machine, and links to <a href="https://docs.microsoft.com/en-us/windows-server/administration/openssh/openssh_install_firstuse">OpenSSH for Windows</a>, which can be added via the user interface or via PowerShell, which is what I did:</p>
<p><img src="https://davidtriana.com/images/posts/2020/admin-powershellpng.png" alt="&ldquo;Opening powershell as admin&rdquo;" title="Opening powershell as admin"></p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback">Add-WindowsCapability -Online -Name OpenSSH.Client~~~~0.0.1.0
</code></pre></div><p>Now what is missing is the SSH key. The documentation suggests first creating a key pair and then using that public key during the virtual machine creation. That certainly works; the way I did it, however, with the --generate-ssh-keys parameter, means that the public and private keys live in the cloud shell. I needed to copy those keys to the client for the authentication to work.</p>
<p>To copy the files, I opened a cloud shell again, this time on the client machine, found the files, and then copied them using the download command.</p>
<p><img src="https://davidtriana.com/images/posts/2020/copyssh.png" alt="Copying files from cloud shell" title="Copying files from cloud shell"></p>
<p>Then I moved them in the client file system to the expected location.</p>
<p><img src="https://davidtriana.com/images/posts/2020/copysshpowershell.png" alt="Moving files to ssh folder" title="Moving files to ssh folder"></p>
<p>The name id_rsa is a default name, and if this is your own client machine you might already have a key with that name. It&rsquo;s fine to rename it, but then the filename will need to be passed explicitly to the ssh command, and also be set explicitly in the VSCode configuration.</p>
<p>If the ssh-add command fails with</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback">Error connecting to agent: No such file or directory
</code></pre></div><p>it&rsquo;s because the ssh-agent service is not running. To start it in PowerShell:</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback">Get-Service -Name ssh-agent | Set-Service -StartupType Manual
Start-Service ssh-agent
</code></pre></div><p>As described <a href="https://docs.microsoft.com/en-us/windows-server/administration/openssh/openssh_keymanagement">here</a>.</p>
<p>With the private key in place, it&rsquo;s time for a test.</p>
<p><img src="https://davidtriana.com/images/posts/2020/sshtest.png" alt="SSH Connection Test" title="SSH Connection Test"></p>
<p>And now that I know my client is able to SSH to the remote, it&rsquo;s time for VSCode to do the same.</p>
<p><img src="https://davidtriana.com/images/posts/2020/codeconnect.png" alt="Connection dialog in VSCode" title="Connection dialog in VSCode"></p>
<p><img src="https://davidtriana.com/images/posts/2020/connecttestvscode.png" alt="VSCode and terminal connected" title="VSCode and terminal connected"></p>
<p>In case the private key file has a different name, as happens to me with the one for this blog, the connection will fail and the extension will let you edit the configuration, which is a text file. In that text file you can provide the path and name of the private key file; in my case, for example:</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback">Host hugoblog
HostName hugoblog
User azureuser
ForwardAgent yes
IdentityFile ~/.ssh/hugoblog
</code></pre></div><h1 id="next-steps">
Next steps
<a class="heading-link" href="#next-steps">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h1>
<p>Depending on the purpose of the remote you will need to apt-get update, configure git, install dotnet core, install python, install Azure CLI &hellip;</p>
<p>And something that works without explicit configuration is port forwarding: in the case of my blog I can browse it locally, as if it were running locally, but in reality it is being port-forwarded from the remote. This works with dotnet core as well.</p>
<p>Have fun!</p></description></item><item><title>Getting the Shell back on WSL</title><link>https://davidtriana.com/posts/2020/getting-the-shell-back-on-wsl/</link><pubDate>Wed, 15 Jul 2020 00:00:00 +0000</pubDate><author>david@davidtriana.com (David Triana)</author><guid>https://davidtriana.com/posts/2020/getting-the-shell-back-on-wsl/</guid><description><p>I have been using Linux more and more, via WSL, and learning a lot along the way.</p>
<p>Part of the journey is breaking and fixing things, which is exactly what happened to me today.</p>
<p>I was trying to get the very nice command prompt, like the one from <a href="https://www.hanselman.com/blog/HowToMakeAPrettyPromptInWindowsTerminalWithPowerlineNerdFontsCascadiaCodeWSLAndOhmyposh.aspx" title="Scott Hanselman's post about pretty prompts">Scott Hanselman</a>, and at some point following different steps from the ones shared by Scott I installed zsh as my shell.</p>
<p>After trying the zsh approach and not being able to make it work, I decided to revert and stick to what Scott did, so the first thing was removing zsh:</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback">sudo apt-get remove zsh
</code></pre></div><p>And then when trying to get the prompt again all I got back was</p>
<p><a href="https://davidtriana.com/images/posts/2020/74581889-deaed800-4fef-11ea-9da4-af11872473e2.png"><img src="https://davidtriana.com/images/posts/2020/74581889-deaed800-4fef-11ea-9da4-af11872473e2.png" alt="process exited with code 1" title="process exited with code 1"></a></p>
<p>Very bad</p>
<p>A similar <a href="https://github.com/microsoft/WSL/issues/4899#issuecomment-658973402">GitHub issue</a> provided some tips, none of which fixed my problem, but <a href="https://github.com/microsoft/WSL/issues/4899#issuecomment-631976826">this particular comment</a> set me in the right path, by running</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback">wsl --user root
</code></pre></div><p>With it I was able to get the bash prompt back, and from there try to fix whatever was broken. Since something was failing while starting the shell after removing zsh, I looked into the .bashrc and .zshrc files and removed them, but no change.</p>
<p>Then a response to my comment on GitHub cleared everything up: &ldquo;either apt install zsh or modify your users shell with chsh&rdquo;.</p>
<p>So I was able to interact as root, since root was still using bash, but my user was still configured to use zsh, and since I had removed it, that&rsquo;s why it wasn&rsquo;t working. And to change the shell? That&rsquo;s where the <a href="https://www.computerhope.com/unix/chsh.htm">chsh command</a> comes to the rescue.</p>
<p>So running</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback">chsh -s /bin/bash myLinuxUsername
</code></pre></div><p>fixed the bash prompt.</p>
<p>Now, back to customizing the prompt, but without zsh, I revisited Scott&rsquo;s post and found I needed to add some lines to the .bashrc file, but I had deleted that one while troubleshooting earlier.</p>
<p>Luckily I found <a href="https://www.reddit.com/r/bashonubuntuonwindows/comments/5d9hhe/can_someone_please_paste_the_content_the_default/da3bv2i/">this reddit comment</a></p>
<p>And sure enough, the default .bashrc was available at /etc/skel, so I just went there and ran</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback">cp .bashrc ~/
</code></pre></div><p>And I was back on track to get my prompt as fancy as Scott&rsquo;s, and of course back to being productive with WSL.</p></description></item><item><title>Updating Azure DevOps WorkItems programmatically</title><link>https://davidtriana.com/posts/2020/updating-azdevops-workitems-programmatically/</link><pubDate>Mon, 18 May 2020 00:00:00 +0000</pubDate><author>david@davidtriana.com (David Triana)</author><guid>https://davidtriana.com/posts/2020/updating-azdevops-workitems-programmatically/</guid><description><h2 id="scenario">
Scenario
<a class="heading-link" href="#scenario">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h2>
<p>Some workitems are missing important information. The information was provided via an Excel sheet, and the workitems need to be updated with it without changing any other properties, particularly the workitem state.</p>
<h2 id="restrictions">
Restrictions
<a class="heading-link" href="#restrictions">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h2>
<p>The workitems are in the Done state, and the process template has rules that prevent changes to the properties that need to be updated.</p>
<p>The Excel integration, which might be a solution, is not working in this environment because of a permissions error.</p>
<h2 id="solution">
Solution
<a class="heading-link" href="#solution">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h2>
<p>Using the Azure DevOps .NET client library, it is possible to update workitems. A relatively simple console application will do the trick, especially since the documentation already provides samples showing how to do it:</p>
<p><a href="https://docs.microsoft.com/en-us/azure/devops/integrate/quickstarts/work-item-quickstart?view=azure-devops">Fetch work items with queries programmatically in Azure DevOps Services</a>, and</p>
<p><a href="https://docs.microsoft.com/en-us/azure/devops/integrate/quickstarts/create-bug-quickstart?view=azure-devops">Create a bug in Azure DevOps Services using .NET client libraries</a>.</p>
<p>The issue with the process template rules remains. Those rules are enforced not only in the user interface but also when using the client library or the API directly, which in the end are the same, so for this to work a permissions change is needed for the particular identity that will be used to interact with AzDevOps via the client library.</p>
<p>The permission is called &ldquo;Bypass rules on work item updates&rdquo; and as of this writing can be found by navigating to Project Settings -&gt; Permissions -&gt; Users, finding the user that will be used, and locating this setting in the permissions panel.</p>
<p><a href="https://davidtriana.com/images/posts/2020/AzDevOpsScreenShot01.png"><img src="https://davidtriana.com/images/posts/2020/AzDevOpsScreenShot01.png" alt="AzDevOps permissions settings" title="AzDevOps permissions settings"></a></p>
<p>Once the user has this permission, he/she can certainly make these changes directly via the user interface; however, doing repetitive tasks by hand is something to avoid.</p>
<p>Or not…</p>
<p><a href="https://twitter.com/kvlly/status/1255173275090640897?s=20"><img src="https://davidtriana.com/images/posts/2020/Tweet01.png" alt="AzDevOps permissions settings" title="AzDevOps permissions settings"></a></p>
<p>Certainly I have done this many times, writing a huge Excel VB macro to move things around or implementing a GitHub Action just to copy a file, but I think in the end it&rsquo;s worth it, primarily because manual tasks are error-prone, but also because of the joy of programming!</p>
<p>So, back to the point,</p>
<p>With the bypass rules permission in place, there is something else needed from AzDevOps: a personal access token, which provides impersonation, authentication and authorization. With this token the code will be able to interact with AzDevOps using the identity of the user for whom the token was generated.</p>
<p>These personal access tokens have a lot of very nice security features, which makes them a very good option for many scenarios; in particular, their permissions can be scoped to specific artifacts and actions, and they are set to expire in 30 days by default.</p>
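<p>Under the hood, a personal access token is used as the password of an HTTP Basic credential with an empty user name, which is also how you would call the REST API directly. A minimal sketch of building that header in Python (the token value here is made up for illustration):</p>

```python
import base64


def pat_auth_header(pat: str) -> str:
    """Build the Authorization header value for an Azure DevOps PAT.

    The PAT travels as the password of a Basic credential with an
    empty user name, i.e. base64(":<pat>").
    """
    token = base64.b64encode(f":{pat}".encode("ascii")).decode("ascii")
    return f"Basic {token}"


# Hypothetical token value, for illustration only
header = pat_auth_header("mytoken")
```

<p>This is what the VssBasicCredential used later in the code takes care of for us.</p>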
<p>At the time of writing this post, the option to create a new token is available in the top right, over the user settings icon,</p>
<p><a href="https://davidtriana.com/images/posts/2020/AzDevOpsScreenShot02.png"><img src="https://davidtriana.com/images/posts/2020/AzDevOpsScreenShot02.png" alt="AzDevOps personal token button" title="AzDevOps personal token button"></a></p>
<p>And then the token details screen looks like this,</p>
<p><a href="https://davidtriana.com/images/posts/2020/AzDevOpsScreenShot03.png"><img src="https://davidtriana.com/images/posts/2020/AzDevOpsScreenShot03.png" alt="AzDevOps personal token details" title="AzDevOps personal token details"></a></p>
<p>With the token in hand, we can now get to the code, which is an adaptation from the examples linked earlier.</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback">using System;
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;
using Microsoft.TeamFoundation.WorkItemTracking.WebApi;
using Microsoft.VisualStudio.Services.Common;
using Microsoft.VisualStudio.Services.WebApi;
using Microsoft.VisualStudio.Services.WebApi.Patch;
using Microsoft.VisualStudio.Services.WebApi.Patch.Json;
using Newtonsoft.Json;

namespace ConsoleApp
{
    class Program
    {
        const string AzDevOpsUri = &#34;https://dev.azure.com/&lt;your_org&gt;&#34;;
        const string PersonalAccessToken = &#34;&lt;the_token_generated_earlier&gt;&#34;;
        const string ProjectName = &#34;&lt;the_name_of_your_project&gt;&#34;;

        static void Main()
        {
            // The personal access token is sent as the password of a basic credential
            var connection = new VssConnection(new Uri(AzDevOpsUri), new VssBasicCredential(string.Empty, PersonalAccessToken));
            var items = LoadJson();
            UpdateWorkItems(connection, items).Wait();
            Console.ReadKey();
        }

        static private async Task UpdateWorkItems(VssConnection connection, IEnumerable&lt;Item&gt; items)
        {
            WorkItemTrackingHttpClient witClient = connection.GetClient&lt;WorkItemTrackingHttpClient&gt;();
            foreach (var item in items)
            {
                // A patch document describing the single field to update
                JsonPatchDocument documentWithTheUpdate = new JsonPatchDocument
                {
                    new JsonPatchOperation()
                    {
                        Operation = Operation.Replace,
                        Path = &#34;/fields/Microsoft.VSTS.Scheduling.Effort&#34;,
                        Value = item.Total
                    }
                };
                try
                {
                    // validateOnly: false, bypassRules: true (requires the
                    // &#34;Bypass rules on work item updates&#34; permission)
                    var result = await witClient.UpdateWorkItemAsync(documentWithTheUpdate, ProjectName, item.ID, false, true);
                    Console.WriteLine($&#34;Workitem {result.Id} updated to revision {result.Rev}&#34;);
                }
                catch (Exception ex)
                {
                    // await rethrows the original exception, so catch Exception here
                    Console.WriteLine($&#34;Error updating workitem {item.ID}: {ex.Message}&#34;);
                }
            }
        }

        public static List&lt;Item&gt; LoadJson()
        {
            using (StreamReader r = new StreamReader(&#34;values.json&#34;))
            {
                string json = r.ReadToEnd();
                return JsonConvert.DeserializeObject&lt;List&lt;Item&gt;&gt;(json);
            }
        }

        public class Item
        {
            public int ID;
            public string Title;
            // Map the JSON property names, which contain spaces, to these fields
            [JsonProperty(&#34;Dev Effort&#34;)]
            public double Dev;
            [JsonProperty(&#34;Test Effort&#34;)]
            public double Test;
            [JsonProperty(&#34;Effort&#34;)]
            public double Total;
        }
    }
}
</code></pre></div><p>At some point I decided to convert the Excel file to JSON; that is why the code has this LoadJson method. Reading the Excel file saved as comma-separated values (CSV) would not have been much different, except for this method.</p>
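<p>For completeness, a conversion from such a CSV export to the JSON shape the console application reads could look like the following sketch in Python (the file names and column headers are assumptions matching my sheet):</p>

```python
import csv
import json


def csv_to_items(csv_path: str, json_path: str) -> None:
    """Convert an Excel CSV export into the JSON list the console app reads."""
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))  # one dict per row, keyed by the header line
    with open(json_path, "w") as out:
        json.dump(rows, out, indent=2)


# Tiny sample export, mirroring the sheet's columns
with open("values.csv", "w") as f:
    f.write("ID,Dev Effort,Test Effort,Effort\n25071,1,0.5,1.5\n")

csv_to_items("values.csv", "values.json")
```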
<p>My JSON input file looks like this:</p>
<div class="highlight"><pre tabindex="0" style="background-color:#fff;-moz-tab-size:4;-o-tab-size:4;tab-size:4"><code class="language-fallback" data-lang="fallback">[
  {
    &#34;ID&#34;: &#34;25071&#34;,
    &#34;Dev Effort&#34;: &#34;1&#34;,
    &#34;Test Effort&#34;: &#34;0.5&#34;,
    &#34;Effort&#34;: &#34;1.5&#34;
  },
  {
    &#34;ID&#34;: &#34;44260&#34;,
    &#34;Dev Effort&#34;: &#34;2&#34;,
    &#34;Test Effort&#34;: &#34;0.5&#34;,
    &#34;Effort&#34;: &#34;2.5&#34;
  },
  {
    &#34;ID&#34;: &#34;43303&#34;,
    &#34;Dev Effort&#34;: &#34;4&#34;,
    &#34;Test Effort&#34;: &#34;1&#34;,
    &#34;Effort&#34;: &#34;5&#34;
  }, ...
</code></pre></div><p>And running the program with proper configuration, and with the JSON file in the right place, produces the sample output below.</p>
<p><a href="https://davidtriana.com/images/posts/2020/AzDevOpsScreenShot04.png"><img src="https://davidtriana.com/images/posts/2020/AzDevOpsScreenShot04.png" alt="Program Output" title="Program Output"></a></p>
<p>And mission accomplished: these workitems now have the missing effort populated, without changing the &ldquo;Done&rdquo; state date.</p></description></item><item><title>Password and two factor authentication in the web</title><link>https://davidtriana.com/posts/2020/password-and-two-factor-authentication-in-the-web/</link><pubDate>Mon, 27 Apr 2020 00:00:00 +0000</pubDate><author>david@davidtriana.com (David Triana)</author><guid>https://davidtriana.com/posts/2020/password-and-two-factor-authentication-in-the-web/</guid><description><p>A few weeks ago a friend called me, very worried about a threatening email he was receiving: someone claiming to have access to his email, threatening to release some compromising information, sharing part of his email password as proof, and asking for a ransom to be paid via Bitcoin.
The password was in fact a password he had used in the past, or maybe for a different site, not his current email password, but still he was very worried about how this happened.</p>
<p>I told him not to worry about the email, it&rsquo;s just spam, but I also provided some advice and explanations on how this happened.</p>
<h1 id="how-this-happened">
How did this happen?
<a class="heading-link" href="#how-this-happened">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>
</h1>
<p>When you register to gain access to a website you typically provide your email and a password. How well the website stores and protects that information is up to the website, and it is often not done very well; even in cases where it has been done correctly <a href="https://edition.cnn.com/2018/11/30/tech/marriott-hotels-hacked/index.html" title="CNN's article about stolen passwords from Starwood hotels now owned by Marriott">it could get leaked or stolen</a>.</p>
<p>Lists of those emails and passwords are available, <a href="https://krebsonsecurity.com/2017/12/the-market-for-stolen-account-credentials/" title="Blog post about the web credentials dark market">sometimes for sale</a>, sometimes for free, and with that information people with not-so-good intentions will try to gain access to your email, bank, Facebook, and every other Internet service that might give them some direct or indirect gain. The email my friend received was put together using this same data.</p>
<h1 id="how-can-i-protect-myself-from-this">
How can I protect myself from this?
<a class="heading-link" href="#how-can-i-protect-myself-from-this">
<i class="fa fa-link" aria-hidden="true" title="Link to heading"></i>
<span class="sr-only">Link to heading</span>
</a>