Thursday, 25 January 2018

PowerShell - Processing Changed Files in a Folder


I needed a way of running an import routine for all the CSV files in a folder that had recently changed. To do this I created a Windows PowerShell script that looks for files with the archive attribute set.

(nb. The archive attribute is always set when a file is created or updated)

The basic process will be:-
  • Get all files in the folder (with the .csv extension).
  • Loop through, looking for those with the archive attribute set.
  • If found, start a new log.
  • Run my data-load process.
  • On success, remove the archive bit.
  • Write the output to the log.
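The check and clear steps above boil down to bitwise operations on the Windows file-attribute flags. Here's a minimal Python sketch of the same flag logic (the 0x20 constant is FILE_ATTRIBUTE_ARCHIVE from the Windows API; the helper names are my own):

```python
FILE_ATTRIBUTE_ARCHIVE = 0x20  # the flag PowerShell sees as [IO.FileAttributes]::Archive

def has_archive_bit(attrs: int) -> bool:
    # Equivalent of PowerShell's: $attrs -band $attribute
    return (attrs & FILE_ATTRIBUTE_ARCHIVE) != 0

def clear_archive_bit(attrs: int) -> int:
    # Clear the flag once the file has been processed
    return attrs & ~FILE_ATTRIBUTE_ARCHIVE

attrs = 0x20 | 0x01  # Archive + ReadOnly, for illustration
print(has_archive_bit(attrs))         # True
print(hex(clear_archive_bit(attrs)))  # 0x1
```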
The structure and layout of PowerShell is reminiscent of Perl: the block structures are familiar, but the comparison operators (-eq rather than ==, -band for a bitwise AND) can catch the unfamiliar out.

The PowerShell Script

Here's what I came up with:-

$path = "<path_to_files>"
$files = Get-ChildItem -Path $path -Filter "*.csv"
$attribute = [IO.FileAttributes]::Archive
$newlog = 0

Foreach ($file in $files) {
  If ((Get-ItemProperty -Path $file.FullName).Attributes -band $attribute) {
    if ($newlog -eq 0) {
      $LogTime = Get-Date -Format "dd/MM/yyyy HH:mm:ss"
      "Processing started: $LogTime" | Out-File import.log
      $newlog = 1
    }
    "File: $file" | Out-File import.log -Append
    $scriptOutput = &<my_external_process> $file.FullName 2>&1
    if ($?) {
      # Success - clear the archive bit so the file isn't picked up again
      Set-ItemProperty -Path $file.FullName -Name Attributes -Value ((Get-ItemProperty $file.FullName).Attributes -bxor $attribute)
    }
    Foreach ($out in $scriptOutput) {
      "$out" | Out-File import.log -Append
    }
  }
}

Having spent a lot of time working with Perl, I found it okay to use. There's quite a nice Windows PowerShell ISE, which does a reasonable job of letting you develop and test in one place. I was also able to open my log file there, but it's a shame it doesn't offer to reload the file when it detects a change.


Just a few things that caught me out:-
  • Ensure you give your script name an extension of ".ps1".
  • When testing, remember to use a PowerShell window, not an ordinary Command Window.

Monday, 4 December 2017

Secret Santa - Without a Facilitator Using QR Codes


We decided to introduce our younger kids to the joy of giving gifts this Christmas by organising a Secret Santa. But we all wanted the fun of guessing who the Secret Santa is, so we decided to find a way that didn't need a facilitator. On top of that, two people had to be included over FaceTime, giving them the same experience and preserving their secrecy.

There are ways to do this using websites and apps, but I wanted to be able to do it with paper, drawing envelopes from a box, so I could supply a generic label for everyone to use.

The Problem

There are basically two rules in this endeavour:
  • ensure you don't pick yourself.
  • maintain secrecy at all times.
I needed a way for the envelopes to be uniquely identifiable, but only with a little deliberate effort. They should all look the same to the naked eye, so nobody can tell when their envelope has been drawn, yet the drawer should still be able to reject their own.

Thinking back to my post on treasure hunts the other year, this seemed like a job for QR Codes!

Printing Codes

Find yourself a good QR code generator online; you can also create them in QR reader apps on your phone.

1 - Start out by creating a load of codes based on simple numbers, to fix to your envelopes. Print them out at a size of around 1 inch square.

Envelope Identifiers (1-7)

2 - Then create a set of labels with the recipient's name and any other details you'd like to add. (I added their Christmas Elf name details.)

Label for my son Saul

Finally I printed off a note for each person. You don't need a QR code on this, and it could just be a bit of paper with their name.

Saul's Note & Instructions

OK, now you're ready to start.

Running the Selection Process

Follow these instructions:-
  1. Fix the envelope identifier QR codes to the outside of your envelopes in a way that they all appear to be the same. (note the orientation of the corner blocks). Use a glue stick so that the label may be removed near the end.
  2. Give each person their own note and label, and a randomly selected envelope.
  3. Each person should then seal these inside their envelope. 
  4. Everyone should then scan their envelope code with their phone to find out their number/id and then post the envelope into a box.
  5. Shake the box to mix up the envelopes and have each person draw one in turn, while others wait at the far end of the room.
  6. After selecting, they should scan the QR code and check it doesn't belong to them. (If it does, have them put it back and draw another.)
  7. Peel off the QR code and place it in the box, then they should open the envelope in secret.
(Repeat 5-7 for all)
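For the curious, the draw-and-reject loop in steps 5-7 amounts to picking a derangement: an assignment where nobody gets their own envelope. Here's a small Python simulation of the same idea, retrying the whole draw until it's valid (the names are just placeholders):

```python
import random

def secret_santa(names):
    """Shuffle the envelopes; redraw the whole round if anyone holds their own."""
    while True:
        envelopes = names[:]
        random.shuffle(envelopes)
        if all(person != env for person, env in zip(names, envelopes)):
            return dict(zip(names, envelopes))

draw = secret_santa(["Saul", "Amy", "Ben", "Chloe"])
for giver, envelope in draw.items():
    print(f"{giver} opens {envelope}'s envelope")
```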

nb. Those joining on Facetime will need an assistant in the room to hold their envelope up to the camera so the remote person can scan it. After drawing the envelope (in step 5) the assistant will also have to write their name and either post it, or find a way of getting it to them.

Hope you find this useful, have a nice holiday.

Monday, 11 September 2017

iPad Won't Start or Charge

The Problem

My son let his iPad run down really low, and then his brother accidentally unplugged it shortly after it had been put on charge. The net result was that the screen went black; when we plugged it back in to charge, all we got was the Apple logo.

After a few hours it was still the same. It didn't appear to be charging and refused to start.

The solution

I fixed it by putting the iPad into DFU mode and leaving it plugged into the charger overnight.
  1. Hold the Power Button (3 secs)
  2. Continue holding the power button and also hold the home button (15 secs)
  3. Release the power button while continuing to hold the home button (10 secs)
The iPad screen stays black after this, but you'll know it's in DFU mode because when you plug it in you no longer get the Apple logo.

The next day I forced it to shut down by holding Power and Home. You should then be able to start it as normal, and it will show a 100% charge.

Hopefully this will work for you, good luck.

Tuesday, 18 July 2017

VB - Deploying an Application using Oracle Data Access


It's great when you get your application to run in Visual Studio, or even from the compiled files on your PC, but at some stage you're going to want to deploy or share it. This is where I ran into problems with Oracle drivers.

Oracle Drivers

You set up your Oracle access in VB by selecting the DLL that came with your client software and adding it to the project references. Here's what mine looked like in Visual Studio..

Assigning the Oracle library as a project reference.
Then add the following imports into your program module..

Imports Oracle.DataAccess.Client
Imports Oracle.DataAccess.Types

But although you've set your references correctly, the DLL location and version are likely to be different on other machines you deploy to, so you'll probably get errors.

You can start to fix this by setting the Copy Local value to True; the DLL file then gets bundled into the release folder when you re-compile.

Set Copy Local
But while this might get your locally compiled version working, it's likely that other machines will still have problems with driver compatibility errors.

Typically you'll see something like this..

The type initializer for 'Oracle.DataAccess.Client.OracleConnection' threw an exception. ---> Oracle.DataAccess.Client.OracleException: The provider is not compatible with the version of Oracle client


Download the Instant Client Basic Lite package from Oracle and unzip it on your PC. Copy the library DLL files from it into the same deploy folder as your executable:-


Hopefully that will get you going.

Friday, 7 July 2017

VB - Connecting to an Oracle Database without using a TNS Entry


Connecting to an Oracle database isn't too much of a problem; there are examples everywhere on the web along the following lines..

Dim conn As New OracleConnection()
Dim connstr As String, dataSource As String, userId As String, password As String
dataSource = "dev10g"
userId = "jsmith"
password = "letmein"
connstr = "Data Source=" + dataSource + ";User Id=" + userId + ";Password=" + password + ";"
conn.ConnectionString = connstr
Try
  conn.Open()
Catch ex As Exception
  ' Database connection failed
  conn.Dispose() 'Dispose of the connection
  Exit Sub
End Try

This works OK, but it's not very portable unless everyone who wants to use it has a TNS entry called "dev10g" in their tnsnames.ora file.

What if we wanted to define the host and port number in the config, and then connect without using TNS?

Connecting to Oracle Directly

In practice all we need to do is alter the connection string to provide the information that the tnsnames.ora entry would have supplied, so the code above doesn't change much.

Dim conn As New OracleConnection()
Dim connstr As String
Dim dbServer As String, dbPort As String, dbServiceName As String
Dim userId As String, password As String

dbServer = "lordv01"
dbPort = "1521"
dbServiceName = "dev10g"
userId = "jsmith"
password = "letmein"
connstr = "Data Source=" + dbServer + ":" + dbPort + "/" + dbServiceName + ";User ID=" + userId + ";Password=" + password
conn.ConnectionString = connstr
Try
  conn.Open()
Catch ex As OracleException
  conn.Dispose() 'Dispose of the connection
  Exit Sub
End Try
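The key piece is the "host:port/service" style Data Source (Oracle calls this EZConnect). As a language-neutral illustration, here's the same string assembled in Python, using the dummy values from above:

```python
def oracle_connstr(host, port, service, user, password):
    # EZConnect-style data source: host:port/service_name, no TNS lookup needed
    return f"Data Source={host}:{port}/{service};User ID={user};Password={password}"

print(oracle_connstr("lordv01", "1521", "dev10g", "jsmith", "letmein"))
# Data Source=lordv01:1521/dev10g;User ID=jsmith;Password=letmein
```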

It took me a bit of poking around to find the right syntax, so hopefully someone will find this useful.

Friday, 30 June 2017

VB - Reading a Text Resource File Line by Line


I'm not a VB programmer, but recently I needed to build an interface between Oracle and an MDB file, so I had to start getting familiar with Visual Studio Express 2013. I've done a lot of LotusScript in the past, so it wasn't a big problem, but some things have proved to be a struggle.

For today's problem, I wanted to store a text file (used as a template) and programmatically create a new version of it with a new name. It felt like the best thing to do was store the file as a resource, but it wasn't immediately obvious how to do it.

Creating the Resource

You can add a resource to your project as a text file using the Resources section of your project Properties. It's quite simple to access it in your code, in the following way:-

myString = My.Resources.MyTextFile

But if you do this, you get the whole file back as one long string. I wanted to read it with a StreamReader and write it out line by line, and because VB always treats text-file resources in this special way, that approach won't work directly.

The answer is to trick VB into thinking it's opening a binary file (just give it a different extension).

I created my file, called inprep.template, and moved it into the project's Resources folder.

I then dragged it into the Resources page to register it as a project resource.

My new file resource

The VB Code

Here's the code to read the file (one row at a time) and output it to a new file.

Dim template As New MemoryStream(My.Resources.inprep)
Dim file As System.IO.StreamWriter
Dim oRdr As StreamReader = New StreamReader(template)
Try
   file = My.Computer.FileSystem.OpenTextFileWriter("c:\test.txt", False)
   Do While oRdr.Peek() >= 0
      file.WriteLine(oRdr.ReadLine()) 'Copy the template out one row at a time
   Loop
   file.Close()
Catch ex As Exception
   'Handle file errors here
End Try

It's simple enough once you realise that you need to handle it as a binary file.
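The same pattern (wrap an in-memory copy of the resource in a reader and stream it out line by line) looks like this in Python; the byte string here just stands in for the embedded resource:

```python
import io

# Stand-in for the embedded resource bytes (My.Resources.inprep in the VB code)
resource_bytes = b"first template line\nsecond template line\n"

reader = io.TextIOWrapper(io.BytesIO(resource_bytes), encoding="utf-8")
with open("test.txt", "w") as out:
    for line in reader:
        out.write(line)  # write the template out one row at a time
```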

Friday, 9 December 2016

A Simple Guide to Oracle Data Pump


In order to create a test system for one of our customers I needed to copy some schemas from the live Oracle database. As the main schema contained a lot of objects, I realised the best tool for the job was Data Pump, something I'd never used before.

Oracle Data Pump first appeared in 10g and provides a newer, more flexible alternative to the 'exp' and 'imp' utilities from previous Oracle versions. It's a server-based technology, so you'll need access to the file directory structure on the database server. That means you'll need remote connections to both the source and target servers.

Running an Export

We're going to start by running a schema export. It's quite straightforward, but we need to ensure we have a directory object configured in Oracle. You could go ahead and just add one, but it's worth checking whether there's one already set up that you can use. Log into the source server (in my case Live) and type the following:-

SQL> select directory_name, directory_path from all_directories;

The results should give you a column listing the object name, and a second stating its actual directory path on disk. It makes sense to choose one where the directory actually exists, and where you have file creation rights, but that should go without saying!

If nothing suitable exists then go ahead and create one, and then grant yourself read and write on it.
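If you do need to create one, it's two statements (run them as a user with the appropriate privileges; the directory name and path here are just examples):-

SQL> create directory dpump_dir as '/u01/app/oracle/dpump';
SQL> grant read, write on directory dpump_dir to <my user>;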

You run the export from the operating systems command prompt, here's what I used:-

expdp <my user>/<my password> directory=<my directory object> dumpfile=<my export file>.dmp schemas=<schema to export>

When it runs, it will output progress to the screen and may take several minutes to complete (depending on the number of objects and the size of the tables). If you'd prefer, the progress can be sent to a file just by including the following parameter:-

  logfile=<export log>.log

Now if you go to the directory listed in the directory_path you should see your DMP file waiting for you. They can be quite large but normally they zip quite well to make file transfer quicker.

Running an Import

The obvious next step is to copy the dump file over to the target server (in my case the new test system), but don't worry about where to put it just yet. We again need to find an Oracle directory object to use for the import process.

I used the same query as before:-

SQL> select directory_name, directory_path from all_directories;

If a directory object exists then move your dump file into it, or (as before) create your own directory object in Oracle.

Before running the import we need to connect as sysdba and create the empty schema in the test system.

SQL> create user newschema identified by pwd4newschema;

(NB. The "Identified By" parameter is the password.)

Finally at the command prompt run the import command:-

impdp <my user>/<my password> DIRECTORY=<my directory object> DUMPFILE=<my export file>.dmp

Again, the progress is reported to the screen. Scan through it and check you don't get any errors. I had some dependency issues because two other schemas referenced in the packaged functions were missing. If this happens to you, copy over the missing references and recompile your packages.