FunTimeBliss ::

All Activity

  2. Oh good, I did reference the XMP timing issue with the CMU32GX4M2C3000C15 Vengeance LED RAM running on an Asus Hero IX. Funny thing: I got new, faster memory and it runs fine with XMP enabled. Even better, the same 3000 MHz RAM also works without issue in another Asus motherboard of a different model. As the bump from 2 years ago implies, I had also been seeing memtest errors for some time. Turns out they were on test 7. That is not a memory issue, but errors related to random value generation. That bug goes back to 2012, so let me save you some sanity checks and crazy debugging. I figured this out a few weeks ago. hahaha
  3. Quoting myself, with a post from Overclockers.com.au: In addition, 1.39 beta has given better hand tracking within the controller range; I was firing off bow-and-arrow shots like a beast last night. Also, on the issue of height: there are several games that offer a standing or seated mode. This looks to directly handle the issue of being very close to the floor, once you set your height to sitting, while sitting. I'll link to an Oculus forum thread about height and accessibility I commented in too. Highlights being: VRChat has height adjustment via seated or standing mode, while Vader Immortal does not have a seated mode.
  4. Been a bit; let's cook this up cleaner. ffmpeg -i "thugcrowd_s2e4_origAud.mp4" -vn -acodec copy "thugcrowd_s2e4_origAud.aac" Shorthand based off https://gist.github.com/protrolium/e0dbd4bb0f1a396fcb55: extract the audio from the source file as a single audio stream, without video (-vn drops the video, -acodec copy passes the audio through without re-encoding).
  5. Below is the process I used to calculate playtime to map chapters. I am no math expert 😛 Sample for encoding with subtitles (note I make a custom metadata file for each episode): ffmpeg -fflags +genpts -i 050_BlueTeamVillage.avi -i metadTc050 -map_metadata 0 -codec copy 050_BlueTeamVillage_01.mp4 For .avi sources, I export as .mp4, since .avi does not like chapters very much. As noted, the chapter math: 4680000 / 60000 = 78 (minutes). So, reversing that from playtime, 78 * 60000 gives the value as defined in the metadata file for chapter indexes.
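The chapter arithmetic above can be sketched in shell. This assumes the common TIMEBASE=1/1000 in an ffmpeg metadata file, so one unit is a millisecond and one minute is 60000 units:

```shell
# ffmpeg chapter metadata with TIMEBASE=1/1000 counts milliseconds,
# so one minute is 60000 units.
ms_per_min=60000
end_units=4680000                      # an END value from the metadata file
minutes=$((end_units / ms_per_min))    # 4680000 / 60000 = 78 minutes
echo "$minutes"                        # prints 78
start_units=$((78 * ms_per_min))       # reverse: minutes back to metadata units
echo "$start_units"                    # prints 4680000
```

A spreadsheet works too, of course; this is just the same multiplication and division made explicit.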
  6. If you catch a desire to burn an ISO to a USB drive, you can also use dd to image your USB storage device, after confirming the device name in gparted, fdisk, or a similar partitions-and-disks tool: dd if=isoToBurn.iso of=/dev/[sd?] status=progress
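Since dd will happily overwrite whatever device you point it at, here is a throwaway-file sketch of the same syntax you can practice with before touching a real /dev/sdX (isoToBurn.iso and usbdrive.img are made-up names for this demo):

```shell
# Practice the dd workflow against plain files instead of a real block device.
dd if=/dev/zero of=isoToBurn.iso bs=1024 count=64 2>/dev/null  # fake 64 KiB "ISO"
dd if=isoToBurn.iso of=usbdrive.img 2>/dev/null                # same form as of=/dev/sdX
cmp -s isoToBurn.iso usbdrive.img && echo "images match"       # prints "images match"
```

On the real device you would add status=progress, as in the command above, and triple-check the of= target first.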
  7. ffmpeg -fflags +genpts -i 049_Hacks4Pancakes.avi -i metadata -map_metadata 0 -codec copy 049_Hacks4Pancakes_01.mp4 The revised sample resolves the 'Timestamps are unset in a packet for stream 0. This is deprecated and will stop working in the future. Fix your code to set the timestamps properly' warning you are likely to see when adding the subtitle files. Let's talk about extensions. Video is most compliant with chapters if you output to .mp4. For audio, .mp3 will show chapters in some but not all players; .m4a / .mp4 output is consistent with chapter recognition and playback to the chapter timings. In my testing, .avi output had no chapter functionality at all. Thanks go to this old ticket.
  8. Pic0o


    Hola and may your current be well. I moved forums around to clean up presentation, with a little better wording for sub-forums too. I keep forgetting I need to edit the CSS to remove the silly page limitation when printing content out; I'm a fan of PDF-dump-compliant content. Granted, there is still a lite theme that gets invoked by your browser user agent, I believe. Site does still work on Lynx 🙂
  9. Pic0o


    I was missing an old staple: a place to gripe about things like Virtualization being off by default after each BIOS firmware update flash. Thank you for this stroll; I know there is plenty of old babble here through the years. I still keep at it, just less so here.
  10. Since the F8 keynote has come and gone, there are release dates for the Rift S and Oculus Quest headsets: May 21st, 2019. As reviews of the Rift S trickle out, there are some pros and cons with the new headset. The main takeaway: it is not really worth bumping from a CV1 headset to a Rift S. There is some improvement in image quality, but also a lower refresh rate of 80 Hz on the Rift S, versus 90 Hz on the CV1. The Rift S has sensors built into the headset, but as I have a 3-sensor setup with my CV1, that is another case where the jump is not really useful for me. Audio is also lower quality due to the change in speakers: no more over-the-ear pieces like on your Rift CV1. No IPD adjustment on the Rift S either; it goes with a default eye width for all. Meanwhile, as I never got the Oculus Go, the Oculus Quest is looking quite appealing as a stand-alone headset. I say this having taken a PC desktop and Rift CV1 on a LAN trip for the day: it made for a great demo, but hauling all that equipment was not fun, so I am tempted to dabble in the stand-alone headset experience too. There will also be a new Steam / Valve Index headset releasing this year. Granted, that will be much more costly at around $1000, while the Rift S will go for $400 and the Oculus Quest will go for $500 with 128 GB of storage, or $400 for the 64 GB unit. Please note, these are my collected observations from others; I only have the CV1 currently 🙂
  11. Total stub thread. Been working on adding chapter content to some files. https://ffmpeg.org/ffmpeg-formats.html#Metadata-1 ffmpeg -i INPUT -i FFMETADATAFILE -map_metadata 1 -codec copy OUTPUT https://medium.com/@dathanbennett/adding-chapters-to-an-mp4-file-using-ffmpeg-5e43df269687 Testing: MP3 will work, but VLC will not inherently show you the chapters; if you encode to mp4, it will. Other audio players do show chapters for an MP3 just fine, though. ffmpeg -i 049_Hacks4Pancakes.mp3 -i metadata -map_metadata 1 -codec copy 049_Hacks4Pancakes00.mp3 Note that the 'metadata' before -map_metadata is my filename for the file with the [Chapters] content. The math is a little silly: each minute is represented by 60000 units. I used a spreadsheet in my case to come up with the numbers.
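For reference, a minimal FFMETADATA file with a [CHAPTER] block might look like the sketch below. The title and timings are made up; per the ffmpeg formats documentation, the file must start with the ;FFMETADATA1 header, and TIMEBASE=1/1000 makes each minute 60000 units:

```
;FFMETADATA1
title=Example Episode

[CHAPTER]
TIMEBASE=1/1000
START=0
END=4680000
title=Everything up to minute 78
```

Add one [CHAPTER] block per chapter, with each START picking up where the previous END left off.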
  12. I have been all over the place lately, mostly mentally. There was a new DLC pack with about 18 levels that came out within a month for $5. I have played 17 of them and just have 2 left to finish. I loved the new maps as well! It totally caught me off guard that this game got extra content and I purchased the extra content as soon as it became an option. I kind of would like to have this game as digital copy too, since I play it pretty often and it would save me from inserting a game card. On the same lines though, having a cart makes it easy to let people borrow and enjoy the game too :)
  13. Bumping this as it seems the 2nd-gen headsets will either release at the F8 keynote, or at least a release date will be given. With head-mounted tracking instead of 3rd-party sensors and the upgraded LCDs in the Rift S, I am quite interested in getting my hands on this headset. The Oculus Quest looks cool too, since it will have 6-DoF versus the Go's 3-DoF (positional and hand-tracking improvements, instead of just head tilting), and also upgraded screens versus the Oculus Go. I may have some feedback here in a few weeks, since I already know a good home for my CV1.
  14. Here is a thread to share any notable changes or concerns with Windows 10 or recent server builds. I start the thread with the 10.0.17763.437 version of Windows 10. Two things jump out upon installing patches. First, a "This app is no longer available." message comes up about CPUID CPU-Z. I had version 1.80.0 installed; the latest is version 1.88.0, and updating should have resolved that. Still, the app-removal feature of Windows 10 seems to be quite old, based off the 2015 date for the build listed in the WindowsClub link, from version 1511. Second, in my taskbar applications, one of my pinned files said it was missing. Browsing to the path opened the file fine. To resolve it, I removed the existing pinned filename, then just dragged the icon for the file down to the taskbar shortcut and got a 'Pin to AppName' option that applied when I let go of the mouse button. For point of reference, I wanted to confirm what recent updates installed on my PC. We could do it via Add/Remove Programs and view them by date, but we out here trying to notate this, so let's run some PowerShell: Get-Hotfix Running this will give results formatted like:

Source    Description      HotFixID   InstalledBy          InstalledOn
------    -----------      --------   -----------          -----------
COMPYX86  Update           KB4483452  NT AUTHORITY\SYSTEM  4/13/2019 12:00:00 AM
COMPYX86  Update           KB4462930  NT AUTHORITY\SYSTEM  4/13/2019 12:00:00 AM
COMPYX86  Security Update  KB4493478  NT AUTHORITY\SYSTEM  4/13/2019 12:00:00 AM
COMPYX86  Security Update  KB4493510  NT AUTHORITY\SYSTEM  4/13/2019 12:00:00 AM
COMPYX86  Security Update  KB4493509  NT AUTHORITY\SYSTEM  4/13/2019 12:00:00 AM

Side note to add links for that Get-Hotfix syntax at SS64 and the Microsoft module documentation. There are some nice flags in there, especially for managing multiple machines. Oh hey: my laptop installed the same 5 updates and also prompted me with a 'Welcome to the October Update' banner in Edge.
Keep in mind I deliberately change my home machines to the Semi-Annual Channel, which means 'ready for widespread use in organizations', instead of the default Semi-Annual Channel (Targeted) setting, which means updates are ready for 'most people' and namely tends to mean a public test channel. Perhaps one of those 5 updates invoked the compatibility-check feature to run again, but I will stop here for now, since that is a good jump-off point.
  15. Just in case you stagger to this archive, I updated this to work with PHP 7.x as well. https://funtimebliss.com/plus/ Also, actual HouseOfPlus.com is under our control thanks to @Immortal Bob
  16. Pic0o

    FTB Archive

    Updating to PHP 7.x required me to retool the mysql commands to use mysqli. Also, I had some sloppy query order, where the database connector needed to come before the query call. All fixed up, back in its lo-fi glory. https://funtimebliss.com/archive/index.php
  17. I have some threads about removing exif metadata from images with exiftool by Phil Harvey, but if you try this on some .mp4 videos, you may find the details are still present. In this case, if you have a DJI drone, I highly suggest checking the exif data: GPS and all sorts of other data are present, and you may very well want to remove that from any content you upload. Remove the metadata with ffmpeg: ffmpeg -i in.mov -map_metadata -1 -c:v copy -c:a copy out.mov Once this finishes, run exiftool again to confirm that sea of data is now much smaller than it was previously. For extra fun and confirmation, look for DJI_ images and have fun matching the GPS coordinates from those photos or videos to locations on Google Maps.
  18. Additional resources, as I am trying to directly run update-grub on a live USB install and seeing if that bodes well. https://bbs.archlinux.org/viewtopic.php?id=198547 https://truthseekers.io/everything-you-need-to-know-to-dual-boot-uefi-gpt-bios-mbr-partitions-swap-space-and-more/
  19. Howdy. I wanted to share some issues I am having installing Parrot OS. Long story short, the partition tables were being set up wrong for GRUB to install. If you search for the grub-efi-amd64 error, you may see people suggest rebooting and selecting the non-UEFI USB boot to install. This turns out to fail too. What we want is a drive with: 1st partition: bios_grub; 2nd partition: boot, esp (EFI boots from an esp-flagged partition). Your OS partition and other partition choices are yours, be it one for the OS and another for swap, or carve out a dedicated /home partition. A buddy told me a dedicated home partition makes life easier in a multi-boot config where you want home directory data shared between each Linux-based OS. This guide is very nice, with a detailed partition layout and configuration for GRUB. It also shows how gdisk or fdisk -l will display the defined partition configuration: gdisk -l /dev/yourDrive to cross-reference the raw partition values against what you may see in GParted. Partition results should look similar to below:

Number  Start (sector)  End (sector)  Size        Code  Name
1       2048            292863        142.0 MiB   EF02
2       292864          2390015       1024.0 MiB  EF00
3       2390016         275019775     130.0 GiB   8300
4       275019776       288692223     6.5 GiB     8200
  20. Scope of this forum changed to more of a break / fix troubleshooting focus. I will keep this thread stickied for that context and any other notes, since this was back from 2014 :D
  21. Pic0o

    SQLite crash-course

    Backstory for this thread: I have a project where I want to review SQLite data. SQLite is, more or less, a self-contained database in a flat file. Usage tends to be for storing application data, especially in the case of mobile apps. In my case I wish to query quite a bit, and to do so across multiple databases. As I have the most database experience in MsSQL, I am exporting data from SQLite so I can place it into a MsSQL database for better querying and results. There are a few GUI tools for reviewing SQLite databases, but if you want to collect data from them outside of their native application, that is where and why I am exporting and importing the data into Microsoft SQL Server. You could do the same with MySQL, and your usage would be slightly different (in the case of using ` instead of ' [single quote]). So pick the database platform you are most comfortable with or like more.

    Task 01: Reading the SQLite database. You can open up the .sqlite file in a text editor, but since it is a binary format, your results will essentially be gibberish characters. While there are some plaintext values, we want the actual raw data set. This will look like your standard database dump / CSV / tables view.

    Task 02: Running SQLite. Let's grab a download of the SQLite binary; pick your OS of choice. In my case I am mainly a Windows user, so I grabbed the sqlite-tools-win32-x86-3270100 Windows binary and extracted it to a target folder. Once extracted, we will see sqlite3.exe. Get used to running this, as it will get us into the SQLite console.

    Task 03: Reading the SQLite database(s). Starting off, grab a copy of the .sqlite file you want to read and paste it into your extracted SQLite tools folder. I tried loading my sqlite data file by its full path, but it was giving me issues; instead of fighting with that, I just pasted a copy into the same folder as the sqlite3.exe we will be running. This is a helpful document on the SQLite website for querying as well.
Once your .sqlite file is in the same folder, bring up a command prompt (cmd.exe) in that folder. I recently learned a nice trick for getting a cmd prompt into the current folder in Explorer: browse to said folder and, in the address bar, replace the file path with 'cmd.exe' (without quotes), and you will get a command prompt in that folder, saving you from changing your drive letter and folder path in the command prompt. In this cmd window, start by running sqlite3.exe. By doing so, your console prompt will change to sqlite> as you are now running sqlite. .help will give you all the available options. Below is a cheat guide on how to load a database, select a table, set your export mode, and export the table contents to a flat file! Yeet

.open 'SQLite_DB_in_folder.sqlite'
.tables
.mode csv
.header on
.output filename.csv
select * from table;
.quit

For the above console example:
- We start by opening the .sqlite database file.
- List the tables in said database.
- Set our export mode to CSV.
- Export with the header / column names as the first row.
- Output the results of the next query to the target flat file.
- Enter the query with the desired table from the listed .tables results (you can review these in the console by just typing a select statement, before you enter the .output line).
- .quit exits the sqlite3.exe console. I suggest exiting after an export, or your output file will remain in use by the sqlite3.exe console connection.

Task 04: Review your output, then import to MsSQL, etc. Open up your output .csv files and they should look like plaintext output. With that being the case, you should be able to import them into the relational database system of your choice and go wild querying away! I should end by noting you can also query from the SQLite console, but since I am looking to compare a large amount of data from various databases, I will import these exported tables into one MsSQL database, with a different table for each.
Note: If your exported .csv came out without column labels (say you skipped .header on), it may be easiest to just add them to the first line of each export! Thanks for reading and have fun heccing all the things!
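The console steps above can also be scripted in one shot with the sqlite3 CLI. This is a sketch on a throwaway database; demo.sqlite and the notes table are made-up names for illustration:

```shell
# Build a tiny throwaway database, then export one table to CSV,
# mirroring the .mode / .header / .output console sequence above.
sqlite3 demo.sqlite "CREATE TABLE notes(id INTEGER, body TEXT);
INSERT INTO notes VALUES (1,'hello'),(2,'world');"
sqlite3 demo.sqlite <<'EOF'
.mode csv
.headers on
.output notes.csv
select * from notes;
.quit
EOF
cat notes.csv   # header row first, then one line per row
```

Handy when you have a pile of .sqlite files to export, since the heredoc makes the console steps repeatable per file.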
  22. Pic0o

    Forums back up

    So you may have noticed the forums were offline since the 13th of Feb or so. Long story short, my host changed their environment and denied that any of their changes broke the site database connections. After a few days of unhelpful back and forth, the forum DB connection just started working again. Just wanted to mention it in case you were trying to browse but saw either DB connection issues or a blank white page. March update: I had to bind the correct PHPMulti config to keep the forums from white-paging out again. 🙂
  23. Pic0o

    GitHub I Has

    Hey yo, peep my soundcloud Github: https://github.com/botsama I do have some pretty dope playlists made on SoundCloud though, haha. Most of the PowerShell scripts there, I showed and explained on ThugCrowd (Twitch). Get back episodes on your favorite podcast service! https://thugcrowd.com
  24. Encoding Guide. Overview on encoding video, stripping audio, and preparing to submit a podcast.

Prerequisites (use whatever OS you like; I have encoded using the same utilities on Linux, but in this case I'm using Windows, and Mac support should be comparable as well):
- mplayer (for mencoder)
- ffmpeg
- your favorite text editor
- some patience while files encode
- a means to download source stream files; I am using Twitch Leecher in our case

Since I am talking about Twitch being our source, I use Twitch Leecher to grab the raw .mp4 file from the Twitch.tv servers. For point of reference, a 2-hour 720p video will be approximately 2.2 GB! Shit, that's a pretty big file. Your size-to-time ratio may vary, but that puts the next step into perspective: encoding to .avi files. Before we start, make sure you grabbed mplayer and ffmpeg. For the Windows heads, let's make this easy and pick a folder for encoding files, say D:\encodes. You can set paths and stuff for mencoder and ffmpeg, but let's be lazy and drop those extracted files into D:\encodes. As you may guess, we will also copy the raw .mp4 file we want to encode into the encodes folder too. Next step: let's prepare the encode scripts. Considering you might be doing this for more than one episode, let's just gear up to batch this process out for multiple files, to make your task easier for each new episode.
Pause for an overview of our process:
- Download the raw file
- Encode it with Xvid to trim some of the file size down
- Make an MP3 to strip out the audio
- Run a maintenance task to fix the timing index (you'll see why below)
- Upload your files somewhere for people to get them
- (Optional) Make an XML RSS feed for your podcast submissions

Sample Windows batch file to make an .avi:

@echo off
echo Cooking it up
mencoder "041_AndrewMorris_GreyNoise_io.mp4" -ovc xvid -xvidencopts bitrate=1800 -o "041_AndrewMorris_GreyNoise_io.avi" -oac mp3lame -lameopts abr:br=192

The first .mp4 is your source; I'm setting the video bitrate to 1800 kbps, -o is outputting the encoded Xvid .avi, and the audio track is being encoded at a 192 kbps bitrate into the same .avi output file. Neat. So now we have a newly encoded .avi file. Be a good encoder and test it! Granted, if one works, you should be golden for your other encodes. Remember, that's why we are scripting it too: a nice way to save some sanity while gaining consistency. This will not be an instantaneous process; I want to say my average encoding speed is about 70 to 90 FPS for the video. So be prepared for that.

Next up: let's cook up some tasty MP3s. In this batch script, we are going to extract the audio from the raw .mp4, but label it as fixTimings.mp3. Try to just play that file and you will see the timing for the track is all broken and randomly changing. That may have been fixed in a later version of mencoder, but I call ffmpeg to fix it:

@echo off
echo Cooking it up
mencoder "041_AndrewMorris_GreyNoise_io.mp4" -of rawaudio -oac mp3lame -lameopts abr:br=192 -ovc copy -o "041_AndrewMorris_GreyNoise_iofixTimings.mp3"
echo Sync Audio
ffmpeg -i "041_AndrewMorris_GreyNoise_iofixTimings.mp3" -acodec copy "041_AndrewMorris_GreyNoise_io.mp3"

As you can see in the ffmpeg call, I use the source file with bad timings and make a corrected .mp3 with the proper time tables.
Luckily, encoding just audio is crazy fast compared to doing video and audio; on an Intel i7-7700k setup I get about 550 FPS in respect to speeds. As I mentioned previously about the videos: TEST YOUR OUTPUT FILES! Once you have the first few good, you should have no shock or issues processing later files.

Getting into writing an RSS feed in XML: let me stop here for now, as the next steps would be uploading your encoded files, writing an RSS feed in XML, then submitting that to various podcast services (iTunes, Spotify, Google Podcasts). You can always view the source of your favorite podcast's feed (duh, it should be ThugCrowd) and edit it to your whim. While most web browsers no longer display RSS feeds in a nice format, besides the OG Firefox engine (i.e. the PaleMoon web browser), you will see the XML that is key to being processed by the podcast services. None of the podcast services host your content; they basically point to your RSS XML feed and the file paths you specify for each episode, so you will want a formidably reliable host. As I mentioned, there are some specific tags for iTunes you should specify to make sure your podcast gets listed. Out of respect for your listeners, be sure to add the date of the episode, the file size, and the track length. It should also help you get listed, since you gave good info out of the gate, before submission. Then when you have a new episode, just add a new Item block with the relevant criteria, and all your subscribers will know there is a new episode! Ok, that is the end of this guide for now.
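To make the Item-block shape concrete, a single episode entry might look like the sketch below. Every URL, date, and value here is a placeholder, and the itunes: tags assume the feed declares the xmlns:itunes namespace in its root element:

```xml
<item>
  <title>Episode 41 - Example Guest</title>
  <pubDate>Tue, 01 Jan 2019 00:00:00 +0000</pubDate>
  <!-- enclosure length is the file size in bytes -->
  <enclosure url="https://example.com/ep41.mp3" length="123456789" type="audio/mpeg"/>
  <guid>https://example.com/ep41.mp3</guid>
  <itunes:duration>02:00:00</itunes:duration>
</item>
```

Date, file size, and track length are exactly the listener-respect fields mentioned above; a new episode is just one more block like this appended to the feed.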
  25. It's that time of year: shop for stuff, see a bunch of 'year in review' articles, and keep an eye out for extra-awful legislation to pass. Besides all that, 2018 is riding out to a close. Security-wise, hmm. Diversify your portfolio. By that I mean make sure your login credentials are really different between sites. Even if a site is doing password security moderately correctly in the database, that doesn't mean the rest of the login stack is authenticating properly, or that some component is not vulnerable to exploit. Just for the sake of repeating it, be sure to contaminate the data you give to free sites and avoid things like your real name or other identifiable attributes. Even if that company is not intending to correlate all that information to build a profile about you, surely some scammer and/or data analytics company is looking to. This year has been stressful but fun. Considering I haven't been a teen for a little bit of time (hahaha), the old scope of life is a little more complex than waking up, doing as I wish, having no obligations, etc. On a fun note, I have been hanging out on Discord servers, chatting with people, and being in some stream chats. ThugCrowd is one of the main places to find me, as I am also the AV person who archives the show episodes, in case of issues with Twitch or if you just want them to play offline, by video or mp3. There are some very cool and smart people on that server; I'm happy to be around these chill people. I also lurk around The Many Hats Club. Good people to be found there as well, just many, many more people and a faster chat. Final plug for ThugCrowd: peep the archives. I like archiving stuff, so it gave me a chance to normalize some of my scripts. Other stuff? I noticed today I am pretty bad at putting content on the front page of the site, so I will backfill October posts to the WordPress front end today. :p I still drive a good bit, dabble in PowerShell, and like setting up Raspberry Pi devices for various purposes.
Sometimes I sleep too, well ok, I take naps.
  26. Pic0o

    Diablo III Thread

    Holy shit?! 2011 (more or less, besides the Mac Mini testing) was the last time this thread had a post? I guess that explains why I am loving this Switch version of Diablo 3. If you expected a time I'd push people to get a Switch, portable Diablo 3 is a damn fine example of it. 4-player online or local co-op. The button config is smooth as silk versus clicking a ton of shit like a mad person. Thread bump as I am playing me some Diablo 3. Your PC characters are separate from the console version, just so you know, so don't expect an ungodly high paragon level out of the gate playing this version. All the updates and game mechanic changes are pretty damn good!