Laravel Localization

Sunday 18 December 2016

I decided to try to translate this site into French, given that I live in the French-speaking part of Switzerland. Laravel has a lot of great localization tools built in, but a few things were sorely lacking. By default, Laravel keeps its localization files in /resources/lang/en. Each file just returns an array, with a key for each string and the translated text as its value. If you want to add a new language you copy the files into a new directory, in this case /fr, and translate the text directly in there. In the views, instead of typing in the text directly, you call trans('file.key') and it pulls the text for 'key' out of 'file.php' in the appropriate language directory. This couldn't be any easier.
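
To give a quick illustration (the file and key names here are made-up examples, not the actual ones from this site), a pair of lang files and the matching call in a view look like this:

// resources/lang/en/home.php
return [
    'welcome' => 'Welcome to my site',
];

// resources/lang/fr/home.php
return [
    'welcome' => 'Bienvenue sur mon site',
];

// in a Blade view
{{ trans('home.welcome') }}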

The hard part came when I started trying to figure out how to set the displayed language automatically. Laravel pulls the locale from config/app.php, and you can change that value easily at runtime, but since nothing carries over between requests it has to be done on every request. So I decided to stick the actual language in a session variable and then change the value in the config array if needed.

I tried to make a middleware to do this on each request, but it didn't work; it seemed as if the session either wasn't saving from the middleware, or possibly hadn't been initialized yet when the middleware ran. More on this below. So I abandoned the middleware route and added a function in my controller that sets the language like this:

App::setLocale(session('locale') ? session('locale') : config('app.locale'));

This worked fine; I just had to make sure to call the function every time a page might need to be translated. My next step was to add an 'fr.' subdomain that would automatically set the language to French. You can do this in the web.php routes file, but from what I can tell the subdomain has to be declared on every single route, which seemed like an awful lot of work for something that should be pretty easy.
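
For reference, this is roughly what the routes-file approach looks like (the domain is a placeholder); the problem is that every route would have to be repeated inside a group like this:

// web.php - a domain-based route group
Route::group(['domain' => 'fr.example.com'], function () {
    Route::get('/', 'HomeController@index');
    // ...and so on for every other route...
});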

So I went back to the middleware and created a middleware called SetLanguage that I added to app\Http\Kernel.php so it runs on every request. The middleware is quite simply this:

public function handle($request, Closure $next)
{
    $pieces = explode('.', $request->getHost());
    if ($pieces[0] == 'fr') {
        session(['locale' => 'fr']);
    }
    return $next($request);
}
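
To make it run on every request it gets registered in app\Http\Kernel.php. I haven't reproduced my exact Kernel here, but the registration looks something like this - I'm showing it in the web middleware group, since the session helper needs the session middleware to have run:

// app/Http/Kernel.php - a sketch; the surrounding entries vary by Laravel version
protected $middlewareGroups = [
    'web' => [
        // ...the framework's cookie, session and CSRF middleware...
        \App\Http\Middleware\SetLanguage::class,
    ],
];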

And this works fine. I think the problem I had with the middleware the first time I tried it was that Laravel has changed how sessions are handled in newer releases, and the Session facade no longer works, or at least works differently. Instead you now use the session() helper function or call $request->session() to modify the session. I had been trying to use the Session facade.
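
In other words, the two approaches that do work look like this (the key name is just an example):

// the session() helper
session(['locale' => 'fr']);                // write
$locale = session('locale', 'en');          // read, with a default

// or through the request object
$request->session()->put('locale', 'fr');
$locale = $request->session()->get('locale', 'en');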

One thing that seems a bit odd: since the middleware runs on every request, it should force every page served from the fr. subdomain into French. In actuality it sets the language to French initially, but if you change the language using the drop-down menu the selection sticks. This doesn't seem right, but in this case the actual behavior makes more sense than the expected behavior, so I am ignoring this bug.
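
I haven't shown the drop-down's handler here, but it doesn't need to be anything more than a route along these lines (the URI and the closure are illustrative):

// web.php - language switcher behind the drop-down menu
Route::get('lang/{locale}', function ($locale) {
    session(['locale' => $locale]);
    return back();
});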

Once I got this figured out, the translation itself was a simple matter of replacing the text in my views with references to the lang files. That went smoothly, although I did have to spend some time figuring out how to translate some technical terms into French, and I'm still not sure I have them all right.

Labels: coding, laravel, localization

Database Change to My Record Collection

Wednesday 14 December 2016

In my records table I used to have the full text of the label in each row, but small typos in the label names ended up screwing things up. So I separated the labels out into their own table and added a foreign key to link the two tables. In the records admin, instead of a text field for the label I used a drop-down populated from the labels table. But this caused more problems than it solved: it greatly complicated the code for searching and updating records, and I had to write extra methods to add new labels to the labels table.

So I decided to get rid of the extra table and just put the full text back into the records table. I kept the drop-down menu in the admin section, but populated it with data pulled from the records table to eliminate my original problem. To my Record model I added a public static function that selects the distinct labels, with the label text as both the key and the value, to pass to the drop-down menu, and that let me greatly simplify a lot of my code.
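
I haven't pasted the real method, but it amounts to something like this (the method and column names are assumptions):

// app/Record.php - distinct labels for the admin drop-down
public static function labelOptions()
{
    return static::query()
        ->distinct()
        ->orderBy('label')
        ->pluck('label', 'label');   // label text as both key and value
}

The admin view can then feed that collection straight into the select element.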

I just pushed this up a few minutes ago, and so far haven't found any problems, but that doesn't mean they aren't there. If anyone finds any errors in the Record Collection page or the API please let me know.

As a side note, having PHPUnit tests for everything already set up made this a whole lot easier. I didn't have to go through and check every possible combination of things that could be searched or test adding and editing and all of that because I already had the tests written. I used to rely on a QA department to regression test everything, but PHPUnit makes all of that much easier and instead of waiting days to have QA people check everything I can do it in under a minute myself.

Labels: coding, music

Backing Up Data to Amazon S3

Tuesday 13 December 2016

I decided I wanted to back up my database somewhere other than my server. All of my code is in git, so the only thing that could be lost in a server failure is the database. To start, I wrote a little shell script to dump the database using mysqldump. I wasn't sure where to put the SQL file to keep it off-server. My first thought was to put it in git, which was easy to do in the script, so I updated the script to add the file to git, commit the changes and push the repo up.

After a bit more thought I decided that might not be the best way to do it. It worked fine, but my usual workflow is to make all changes locally, push them to git, and then pull to production - I don't make any changes on the production server unless absolutely necessary. While adding files to git that don't exist in my dev environment shouldn't really cause any problems, I thought there must be a better way.

So I decided to put the dump file into an Amazon S3 bucket. Laravel can use S3 as a filesystem, as documented here, but I had tried that before and not had much luck. I saw that Amazon has a PHP package to interact with S3, the AWS SDK for PHP, so I thought I would try that out. After a little more digging I found that Amazon also has a package specifically for Laravel, which is located here. That turned out to be the winner. Instead of trying to read pages of documentation for Laravel's filesystem or the Amazon SDK, all I needed was a few lines of code and I was up and running. As a note, this package keeps the Amazon key and secret in the .env file, which is a lot better than keeping them in the filesystems.php config file like Laravel does. If you are going to use Laravel's S3 filesystem I suggest you update filesystems.php to pull them from .env.
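
That change is just a matter of swapping the hard-coded values for env() calls, something like this (the variable names are whatever you choose to put in your .env):

// config/filesystems.php - pull the credentials from .env instead of hard-coding them
's3' => [
    'driver' => 's3',
    'key'    => env('AWS_ACCESS_KEY_ID'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    'region' => env('AWS_DEFAULT_REGION'),
    'bucket' => env('AWS_BUCKET'),
],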

Now that I was able to upload files to S3 from a browser, the next step was to create an artisan command that I could add to my shell script. The Laravel documentation for this was clear and easy to follow. The only problem I had was a typo that for some reason didn't throw an error locally, but did on my production server. Other than that this is tested and working.
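
I haven't pasted my exact command, but the shape of it is roughly this, assuming the aws-sdk-php-laravel package and its AWS facade - the class name, signature and bucket are placeholders:

<?php
// app/Console/Commands/BackupToS3.php - a rough sketch, not the exact command

namespace App\Console\Commands;

use Aws\Laravel\AwsFacade as AWS;
use Illuminate\Console\Command;

class BackupToS3 extends Command
{
    protected $signature = 'backup:s3 {file : path to the SQL dump}';
    protected $description = 'Upload a database dump to an S3 bucket';

    public function handle()
    {
        $file = $this->argument('file');

        // the package exposes the AWS SDK through a facade
        $s3 = AWS::createClient('s3');
        $s3->putObject([
            'Bucket'     => 'my-backup-bucket',           // placeholder
            'Key'        => 'backups/' . basename($file),
            'SourceFile' => $file,
        ]);

        $this->info('Uploaded ' . basename($file));
    }
}

After the usual registration in app/Console/Kernel.php, the shell script just runs mysqldump and then calls php artisan backup:s3 /path/to/dump.sql.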

I had considered using S3 for this site in the past, but decided not to since I had problems with the Laravel S3 filesystem. Now that I've integrated with S3 so easily I may revisit that decision.

Labels: coding

Redis

Sunday 11 December 2016

I've been messing around with Redis for a little while now and I'm using it in a couple of places on this site. The first thing I did was start caching some DB queries that get run a lot, like the main blog page, the blog archives menu and the list of recent posts on the home page. Laravel's Cache facade makes caching really easy, and you can switch between the default file-based cache driver and Redis without changing any of the code. For some pages I use the Cache facade, for others I use the Redis facade to cache directly to Redis, just for some variety. In general it is probably better to use the Cache facade than the Redis facade, because if you want to switch to a different caching mechanism you can do it with one change to the .env file instead of having to rewrite all of the code.
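
The caching itself is basically one call. Something like this handles the recent-posts list - the key name, lifetime and query are illustrative rather than my exact code:

use Illuminate\Support\Facades\Cache;

// cache the query result for 60 minutes under a named key
$recentPosts = Cache::remember('recent-posts', 60, function () {
    return Post::orderBy('created_at', 'desc')->take(5)->get();
});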

I also use Redis to queue some tasks which don't need to be done synchronously, but that's another post.

I never used Memcache because I didn't like the fact that everything is stored only in memory, so if the server goes down you lose all of the data in it. Redis persists data to disk by default, so it provides the speed of keeping data in memory with a very low risk of losing it. In my case I store the data in the DB and cache it in Redis, but if I were starting from scratch (and had plenty of RAM on my server) I would probably keep a lot more data in Redis.

So, to summarize, I think Redis is awesome and I will definitely make more use of it in my stack in the future.

Labels: coding

Update on Pulling Data from Discogs

Thursday 01 December 2016

I initially ran into problems pulling the data from Discogs because I was using Laracurl, a package that provides a Laravel wrapper around cURL, and it didn't send a header that Discogs requires. So I made a change to the package and got it working. After someone advised me that Guzzle was a better option than cURL, I switched to that and rewrote my matching code in the process. After running the new, improved, streamlined code I have now matched all but 250 of my records to the Discogs data, and 50 of those unmatched records are white labels which may not be matchable. So I am pretty happy with that ratio and will be moving on to my next project now.
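
For the curious, the header Discogs insists on is a User-Agent identifying your app. With Guzzle a search request boils down to something like this - the app name, parameters and token are placeholders rather than my actual matching code:

use GuzzleHttp\Client;

// Discogs rejects requests without an identifying User-Agent
$client = new Client([
    'base_uri' => 'https://api.discogs.com/',
    'headers'  => ['User-Agent' => 'MyRecordCollection/1.0'],
]);

$response = $client->get('database/search', [
    'query' => [
        'artist' => 'Some Artist',
        'title'  => 'Some Record',
        'token'  => env('DISCOGS_TOKEN'),
    ],
]);

$results = json_decode((string) $response->getBody(), true);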

Almost all of my records now have a link to discogs and a thumbnail pulled from discogs in the record detail page. If anyone wants to see additional info on the records such as tracklisting, year of release, or anything else, that info will be available on discogs. I didn't see any need to duplicate that data locally.

The code I wrote to automatically match my records to Discogs did not match everything 100% accurately, so I tried to review the matches to make sure they were correct, but I may have missed a couple here and there.

Thanks to Discogs.com for making a great API. 

Labels: coding, music