FunTimeBliss ::

All Activity


  1. Earlier
  2. Additional resources, as I am trying to run update-grub directly from a Live USB install to see if that fares any better. https://bbs.archlinux.org/viewtopic.php?id=198547 https://truthseekers.io/everything-you-need-to-know-to-dual-boot-uefi-gpt-bios-mbr-partitions-swap-space-and-more/
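     For reference, the usual Live USB repair flow I am attempting looks roughly like this (a sketch, not a recipe from either link; /dev/sdX2 and /dev/sdX3 are hypothetical stand-ins for your ESP and root partitions, run as root):

         mount /dev/sdX3 /mnt
         mount /dev/sdX2 /mnt/boot/efi
         for d in dev proc sys; do mount --bind /$d /mnt/$d; done
         chroot /mnt
         grub-install --target=x86_64-efi --efi-directory=/boot/efi
         update-grub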
  3. Howdy. I wanted to share some issues I am having installing Parrot OS. Long story short, the partition tables are being set up wrong for GRUB to install. If you search for the grub-efi-amd64 error, you may see people suggest rebooting and selecting the non-UEFI USB boot option to install. That turns out to fail too. What we want is a drive with: 1st partition: bios_grub; 2nd partition: boot, esp (the ESP is the EFI System Partition). Your OS partition and other partition choices are yours. Be it one for the OS and another for swap, or carve out a dedicated /home partition. A buddy told me a dedicated home partition makes life easier in a multi-boot Linux environment where you want home directory data shared between each installed Linux-based OS. This guide is very nice and details the partition layout and configuration for GRUB. It also shows how gdisk or fdisk -l will display the defined partition configuration. Run gdisk -l /dev/yourDrive to cross-reference the raw partition values against what you may see in GParted. Partition results should look similar to below:

     Number  Start (sector)  End (sector)  Size        Code  Name
     1       2048            292863        142.0 MiB   EF02
     2       292864          2390015       1024.0 MiB  EF00
     3       2390016         275019775     130.0 GiB   8300
     4       275019776       288692223     6.5 GiB     8200
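     If you are laying the disk out by hand before running the installer, something like this parted sequence should produce that layout (a sketch under the assumption your target disk is /dev/sdX; the sizes mirror the table above, and you can add a swap partition the same way):

         parted /dev/sdX mklabel gpt
         parted /dev/sdX mkpart biosgrub 1MiB 143MiB
         parted /dev/sdX set 1 bios_grub on
         parted /dev/sdX mkpart efi fat32 143MiB 1167MiB
         parted /dev/sdX set 2 esp on
         parted /dev/sdX mkpart root ext4 1167MiB 100%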
  4. The scope of this forum changed to more of a break / fix troubleshooting focus. I will keep this thread stickied for that context and any other notes, since it dates back to 2014 :D
  5. Backstory for this thread is I have a project where I want to review SQLite data. SQLite is, more or less, a whole database in a single flat file. Usage tends to be for storing application data, especially in the case of mobile apps. In my case I wish to query quite a bit, and to do so across multiple databases. As I have the most database experience in MsSQL, I am exporting data from SQLite so I can place it into a MsSQL database for better querying and results. There are a few GUI tools for reviewing SQLite databases, but if you want to collect data from them outside of their native application, this is where and why I am exporting and importing the data into Microsoft SQL Server. You could do the same with MySQL and your usage would be slightly different (in the case of using ` instead of ' [single quote]). So pick the database platform you are the most comfortable with or like more.

Task 01: Reading the SQLite database. You can open up the .sqlite file in a text editor, but since the file format is binary, your results will essentially be gibberish characters. While there are some plaintext values, we want the actual raw data set. This will look like your standard database dump / CSV / tables view.

Task 02: Running SQLite. Let's grab a download of the SQLite binary. Pick your OS of choice. In my case I am a Windows main user, so I grabbed the sqlite-tools-win32-x86-3270100 Windows binary and extracted it to a target folder. Once extracted we will see sqlite3.exe. Get used to running this, as this will get us into the SQLite console.

Task 03: Reading the SQLite database(s). Starting off, let's grab a copy of the .sqlite file you want to read and paste a copy into your extracted SQLite tools folder. I tried loading my SQLite data file by full path, but it was giving me issues. Instead of fighting with that, I just pasted a copy into the same folder as the sqlite3.exe we will be running. This is a helpful document on the SQLite website for querying as well. Once your .sqlite file is in the same folder, bring up a command prompt (cmd.exe) in that folder. I recently learned a nice trick for getting a cmd prompt into the current folder in Explorer: browse to said folder and, in the address bar, replace the file path with 'cmd.exe' (without quotes) and you will get a command prompt in that folder, saving you from changing your drive letter and folder path in the command prompt. In this cmd window, start by running sqlite3.exe. By doing so your console prompt will change to sqlite> as you are now running SQLite. .help will give you all the available options. Below is a cheat guide for how to load a database, select a table, set your export mode and export the table contents to a flat file! Yeet

     .open 'SQLite_DB_in_folder.sqlite'
     .tables
     .mode csv
     .header on
     .output filename.csv
     select * from table;
     .quit

For the above console example:
- We start by opening the .sqlite database file.
- List the tables in said database.
- Set our export mode to CSV.
- Export with header / column names as the first row.
- Direct the output of the next query to the target flat file.
- Enter the query with the desired table from the listed .tables results (you can preview these in the console by just typing the select statement, before you enter the .output line).
- .quit exits the sqlite3.exe console. I suggest exiting after an export, or your output file will remain in use by the sqlite3.exe console connection.

Task 04: Review your output, then import to MsSQL, etc.
Open up your output .csv files and they should look like plaintext output. With that being the case, you should be able to import them into the relational database system of your choice (a quick MsSQL import sketch follows at the end of this post) and go wild querying away! I should end noting you can also query from the SQLite console too, but since I am looking to compare a large amount of data from various databases, I will import these exported tables into one database on MsSQL, with a different table for each. Note: if you skipped .header on, your exported .csv will NOT have column labels. It may be easier to just add them to the first line of each of your exports! Thanks for reading and have fun heccing all the things!
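PS: the promised import sketch. This is a minimal example of pulling one of those exports into MsSQL; the table, columns and path are hypothetical placeholders, and the FORMAT = 'CSV' option assumes SQL Server 2017 or newer:

     -- Landing table matching your export's columns (names here are made up).
     CREATE TABLE dbo.ExportedTable (Col1 NVARCHAR(255), Col2 NVARCHAR(255));

     -- Pull in the CSV; FIRSTROW = 2 skips the header row from .header on.
     BULK INSERT dbo.ExportedTable
     FROM 'C:\sqlite-tools\filename.csv'
     WITH (FORMAT = 'CSV', FIRSTROW = 2);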
  6. Pic0o

    Forums back up

    So you may have noticed the forums were offline since the 13th of Feb or so. Long story short, my host changed their environment and denied that any of their changes broke the site database connections. After a few days of unhelpful back and forth, the forum DB connection just started working again. Just wanted to mention it in case you were trying to browse but saw either DB connection issues or a blank white page. March update: had to bind the correct PHPMulti config to keep the forums from white-paging out again. 🙂
  7. Pic0o

    GitHub I Has

    Hey yo, peep my soundcloud GitHub: https://github.com/botsama I do have some pretty dope playlists made on SoundCloud though. Haha. Most of the PowerShell scripts there I showed and explained on ThugCrowd (Twitch). Get back episodes on your favorite podcast service! https://thugcrowd.com
  8. Encoding Guide. Overview on encoding video, stripping audio and preparing to submit a podcast.

Prerequisites: Use whatever OS you like! I have encoded using the same utilities on Linux, but in this case I'm using Windows. Mac support should be comparable as well.
- mplayer (mencoder ships with it)
- ffmpeg
- your favorite text editor
- some patience while files encode
- a means to download source stream files. I am using Twitch Leecher in our case.

Since I am talking about Twitch being our source, I use Twitch Leecher to grab the raw .mp4 file from the Twitch.tv servers. For a point of reference, a 2 hour 720p video will be approximately 2.2 GB! Shit, that's a pretty big file. Your size-to-time ratio may vary, but that puts the next step into perspective: encoding to .avi files.

Before we start, make sure you grabbed mplayer and ffmpeg. For the Windows heads, let's make this easy and pick a folder for encoding files, say D:\encodes. You can set paths and stuff for mencoder and ffmpeg, but let's be lazy and drop those extracted files into D:\encodes. As you may guess, we will also copy the raw .mp4 file we want to encode into the encodes folder too. Next step: let's prepare the encode scripts. Considering you might be doing this for more than one episode, let's gear up to batch this process out for multiple files (a generalized loop sketch follows at the end of this post's encoding section), to make your task easier for each new episode.

Pause for an overview of our process:
- Download the raw file
- Encode it with Xvid to trim some of the file size down
- Make an MP3 to strip the audio
- Run a maintenance task to fix the timing index (you'll see why below)
- Upload your files somewhere for people to get them
- (Optional) Make an XML RSS feed for your podcast submissions

Sample Windows batch file to make an .avi:

     @echo off
     echo Cooking it up
     mencoder "041_AndrewMorris_GreyNoise_io.mp4" -ovc xvid -xvidencopts bitrate=1800 -o "041_AndrewMorris_GreyNoise_io.avi" -oac mp3lame -lameopts abr:br=192

The first .mp4 is your source; I'm setting the video bitrate to 1800 kbps, -o is outputting the encoded Xvid .avi, and the audio track is being encoded at a 192 kbps bitrate into the same .avi output file. Neat. So now we have a newly encoded .avi file. Be a good encoder and test it! Granted, if one works, you should be golden for your other encodes. Remember, that's why we are scripting it too: a nice way to save some sanity while gaining consistency. This will not be an instantaneous process. I want to say my average encoding speed is about 70 to 90 FPS for video, so be prepared for that.

Next up: let's cook up some tasty MP3s. In this batch script, we are going to extract the audio from the raw .mp4, but label it as fixTimings.mp3. Try to just play that file and you will see the timing for the track is all broken and randomly changing. That may have been fixed in a later version of mencoder, but I call ffmpeg to fix it.

     @echo off
     echo Cooking it up
     mencoder "041_AndrewMorris_GreyNoise_io.mp4" -of rawaudio -oac mp3lame -lameopts abr:br=192 -ovc copy -o "041_AndrewMorris_GreyNoise_iofixTimings.mp3"
     echo Sync Audio
     ffmpeg -i "041_AndrewMorris_GreyNoise_iofixTimings.mp3" -acodec copy "041_AndrewMorris_GreyNoise_io.mp3"

As you can see in the ffmpeg call, I use the source file with bad timings and make a corrected .mp3 with the proper time tables. Luckily, encoding just audio is crazy fast compared to doing video and audio; on an Intel i7-7700K setup I get about 550 FPS in respect to speeds. As I mentioned previously about the videos: TEST YOUR OUTPUT FILES!
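Here is that promised loop sketch for batching multiple episodes; it just generalizes the encode step above to every .mp4 sitting in the folder, with the same bitrates I used (adjust to taste):

     @echo off
     rem Encode every .mp4 in this folder to an Xvid .avi with 192 kbps MP3 audio.
     for %%F in (*.mp4) do (
         echo Cooking up %%F
         mencoder "%%F" -ovc xvid -xvidencopts bitrate=1800 -oac mp3lame -lameopts abr:br=192 -o "%%~nF.avi"
     )

%%~nF expands to the file name without its extension, so each episode gets a matching .avi name.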
Once you have the first few good, you should have no shocks or issues processing later files.

Getting into writing an RSS feed in XML: let me stop here for now, as the next steps would be uploading your encoded files, writing an RSS feed in XML, then submitting that to various podcast services (iTunes, Spotify, Google Podcasts). You can always view the source of your favorite podcast's feed (duh, it should be ThugCrowd) and edit it to your whim. While most web browsers no longer display RSS feeds in a nice format, besides the OG Firefox engine (i.e. the PaleMoon web browser), you will see the XML that is key to being processed by the podcast services. None of the podcast services host your content; they basically point to your RSS XML feed and the file paths you specify for each episode, so you will want a formidably reliable host. As I mentioned, there are some specific tags for iTunes you should specify to make sure your podcast gets listed. Out of respect for your listeners, be sure to add the date of the episode, the file size and the track length. It should also help you get listed, since you gave good info out of the gate, before submission. Then when you have a new episode, just add a new item block with the relevant criteria, and all your subscribers will know there is a new episode! Ok, that is the end of this guide for now.
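PS: for the curious, a single episode's item block ends up looking roughly like this (a sketch; the URL, byte size and times are hypothetical, and the itunes: tags need the iTunes namespace declared on the channel element):

     <item>
       <title>041 - Andrew Morris - GreyNoise.io</title>
       <pubDate>Tue, 01 Jan 2019 21:00:00 EST</pubDate>
       <enclosure url="https://example.com/podcast/041_AndrewMorris_GreyNoise_io.mp3" length="115343360" type="audio/mpeg" />
       <guid>https://example.com/podcast/041_AndrewMorris_GreyNoise_io.mp3</guid>
       <itunes:duration>2:00:00</itunes:duration>
     </item>

That covers the episode date, file size (the length attribute, in bytes) and track length I mentioned above.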
  9. It's that time of year. Shop for stuff, see a bunch of 'year in review' articles and keep an eye out for extra awful legislation to pass. Besides all that, 2018 is riding out to a close. Security wise, hmm. Diversify your portfolio. By that I mean make sure your login credentials are really different between sites. Even if a site is handling password security in the database moderately correctly, that doesn't mean the rest of the login stack is authenticating properly, or that some component is not vulnerable to exploit. Just for the sake of repeating it, be sure to contaminate the data you give to free sites: not your real name, or other identifiable attributes. Even if that company is not intending to correlate all that information to build a profile about you, surely some scammer and/or data analytics company is looking to. This year has been stressful but fun. Considering I haven't been a teen for a little bit of time (hahaha), the old scope of life is a little more complex than waking up, doing as I wish, having no obligations, etc. On a fun note, I have been hanging out on Discord servers, chatting with people and being in some stream chats. ThugCrowd is one of the main places to find me, as I am also the AV person who archives the show episodes, in case of issues with Twitch or if you just want them to play offline, by video or MP3. There are some very cool and smart people on that server; I'm happy to be around these chill people. I also lurk around The Many Hats Club. Good people to be found there as well, just many many more people and a faster chat. Final plug for ThugCrowd: peep the archives. I like archiving stuff, so it gave me a chance to normalize some of my scripts. Other stuff? I noticed today I am pretty bad at putting content on the front page of the site, so I will back-fill an October post to the WordPress front end today. :p I still drive a good bit, dabble in PowerShell and like setting up Raspberry Pi devices for various purposes. Sometimes I sleep too; well ok, I take naps.
  10. Pic0o

    Diablo III Thread

    Holy shit?! 2011 (more or less, besides the Mac Mini testing) was the last time this thread had a post? I guess that explains why I am loving this Switch version of Diablo 3. If you expected a time I pushed people to get a Switch, portable Diablo 3 is a damn fine example of it. 4 player online or local co-op. The button config is smooth as silk versus clicking a ton of shit like a mad person. Thread bump as I am playing me some Diablo 3. Your PC characters are separate from the console version, just so you know, so don't expect an ungodly high paragon level out of the gate playing this version. All the updates and game mechanic changes are pretty damn good!
  11. Pic0o

    Fallout 76

    I played the beta and wanted to share some of my experiences. It was only a few hours' worth, but I came to a decision: canceled pre-order. Not to be dramatic about it, but the reasons I canceled are:
- Always online Fallout. I got the feeling people were shooting at me as I explored the starting content, and I was right. Luckily you can sleep in a bed and not get killed by other actual players.
- Food and water mechanics. If I AFK, my character will need food and water and can potentially die.
- Recovering items from your corpse.
- Building mechanics.
- VATS completely changed. Since the game is always online, you cannot slow time and aim for parts. VATS now works like an enhanced aim, but with really limited functional use.

While most of my reasons for disinterest are pretty popular in other games... I do not want a forced always-online experience. Co-op as an option would be great, but I am hard-passing on a persistent online environment where I have to manage food, water and getting player-killed. Maybe if there were an offline option so I could enjoy the content and, like, actually pause, I would be interested. I am not a huge fan of MMO-like mechanics in a game I feel is built for single player. Granted, if I were playing an MMO, PVP is established as a mechanic with some sort of consent, but if I were to AFK in a town in something like a Final Fantasy or World of Warcraft, I would not have to feed a food and drink bar to stay alive. Survival mechanics are not my speed. Hopefully you enjoyed my opinion and moderate rant about Fallout 76. Perhaps it helped you make some choices too. I like to pause and take IRL breaks, walk the dog, etc.
  12. I explored a second world, bought the DLC compilation pack and am enjoying the exploration and story narrative. That being said, if you do not want the physical toy items, save some money and get the $60 digital version with most everything included, or the $80 digital version with all the pilots, weapons and ships. This is a pretty chill game. It's fun to play in bed or whatever. I'm going to play in tablet mode over break here in a little bit, since I tend to play docked on the TV. There are 2 ships that are physical exclusives, sold at GameStop and Target stores. The Scramble ship is a repaint of the Pulse ship and the Cerberus is a repaint of the Lance ship. I will test tonight whether they actually are different ships in-game, as that would bring the total ship count to 8. The Star Fox Arwing, Scramble and Cerberus are the 3 exclusive ships: Star Fox being Switch exclusive and the other 2 being retailer specific.
  13. Pic0o

    Merry Spooky!

    Hopefully you are having a nice Halloween! It's about mid-60s F temperature-wise today. I will step out for a break shortly. Game wise, Fallout 76 had a beta that was kind of a failure, since console and PC players had to download the same 45 GB multiple times. On the upswing, the next beta window is extended to run from 2 PM EST until 11 PM EST tomorrow, 2018/11/01. Most of the other ones are only around 5 hour blocks, with a release on 11/14. Speaking of which, Bethesda has their own launcher client now, instead of you being able to just get the game on Steam. I guess they want in on those market demographics and to not pay Steam a cut of sales. Tech wise, hmmm, hahaha. Red Hat was bought by IBM. Kind of big news, as I figure that recurring service contract income is a big deal for IBM to be able to attach to their financials. Red Hat is only going to get bigger; well, at least they were poised to before the purchase. As a mainline Windows systems person, you kind of have to have your head in the sand to deny Linux computing in the server stack. I have to say the inverse applies to thinking Microsoft is not going to remain in the business stack for some time as well. But enough of my ideologies. :) I recently got a better webcam so I can play Magic: The Gathering with some buddies, using actual cards. It's a good time; I just tend to be out a bit between work and the occasional local tech event I like to turn up for. I also helped the missus install Windows 10 in a VM on her Manjaro install. She loathes Windows, so my powers grow stronger by enabling her to do so. Bwhahaha! Have a nice week! Oh yeah, we are on the 4th Nightmare on Elm Street film for our Halloween sessions. She is getting to enjoy them for the first time.
  14. Cost to content wise, the digital versions give you the most content. You will need 12.7 GB or so on your SD card though. The $60 digital version gets you 5 ships and other items, whereas the $80 digital version gets you all the content. Physically, each ship is $25, and you get a single weapon and a pilot with the ship. Weapon items come in a 2-pack and cost $10. Pilots are $8 each. If you start with the physical game, there is a DLC pack 1 that looks to be $60; this gets you all the ships, pilots and weapons. The physical edition is $75, so add another $60 for the DLC pack to bring you to $135 for all the content, plus the Star Fox toys to go with the controller ship adapter. Or you can get all the content digitally for $80. Even the $60 digital version gets you 5 ships, whereas you get 2 with the physical Switch version or 1 with the Xbox / PlayStation versions. Digital content can be bought individually, but each ship is about $13, so roughly half the price of the physical items.
  15. I picked this up for Nintendo Switch this weekend. I got the physical edition with the Star Fox toys. Despite some hate the game gets, it is pretty fun and also looks quite nice. Do be warned it is mainly a toys-to-life game, so you can buy IRL ships, pilots and weapon toys, or go all digital and save some money. Common reference points are to No Man's Sky, but with story dialog instead of just exploration and minimal objectives. Fair disclaimer: I am only on the second world. I played a few hours and 100% explored the world 1 map. Extra ships do not appear unlockable from the story, but are tied to the toys or digital DLC. Weapons may be the same case, but weapon power-ups are found in-game. You can get some nice weapon power-ups if you register on the Ubisoft Club site and use the gold coins you get from in-game objectives; I had plenty from the other Ubisoft games I have played. I may get to try out co-op soon as well. When I do, I will share details on that too. FYI: to use your second ship, remove the right Joy-Con and it will put you into digital mode instead of physical, and you can fly other ships.
  16. Pic0o

    Pi-Hole

    Recounting in case your Pi-hole has an issue getting gravity lists or updates, in either the OS or the Pi-hole web interface: check your router / firewall logs for IP-MAC binding errors. Since the device is likely set up with a static IP, you may want to add a binding rule if you see those errors in the logs. Updates should then work. Oh cool, Conditional Forwarding is a setting in the Pi-hole admin settings. If your Pi-hole is not acting as the DHCP server, your device names will not all be readable from your router; enabling it fixed that in my instance. Yay, updated interface!
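    For reference, once the binding rule is in, these stock Pi-hole commands (run on the Pi itself) are a quick way to confirm the list pulls and updates work again; a sketch, nothing exotic:

         pihole -g        # re-pull the gravity blocklists
         pihole -up       # update the Pi-hole software itself
         pihole status    # confirm DNS blocking is enabled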
  17. nouveau.modeset=0 https://www.youtube.com/watch?v=-3aDEVHtA7M&list=PLItvWBLwYxo47PtOwgvqdjfnq9in05yoM
  18. This thread is more like my personal notes than a guide, especially since the consistent file path variable was something I only recently got my head around. Hopefully it made sense in its current format. I wanted to mention you will get limited results if you do not run the PowerShell from an administrator-level elevated prompt. Otherwise, cmdlets like Get-ScheduledTask will not show all jobs on said machine.
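     Quick aside: if you are already sitting in a non-elevated console, this one-liner (standard PowerShell, nothing custom) pops the elevated session to run the scripts from:

         Start-Process powershell -Verb RunAs    # opens a new PowerShell window via the UAC elevation prompt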
  19. Pic0o

    July greetings

    I consolidated the Projects forum into Break / Fix. It gives a cleaner read of the forum and its threads that way. Hardware is still at the top and separate, so as to avoid flooding out threads.
  20. I have been doing a bit of PowerShell to configure and interact with various Windows versions. I built up some core scripts to use as my own kind of workshop for system review and administration. I wanted to drop an example script to chat about. One of the things I struggled to understand starting out was string substitution, and being able to define a variable that would also consistently output to a file path of my choosing. TL;DR on that resolution: wrap the other variable you are calling (example: file paths) in a $() block. As seen below, I call my computer name environment variable so it can be used in the output file names and logs.

     # getEventLogs: Maintenance collection script.
     $boxName = $env:COMPUTERNAME
     $outEvt01 = ".\$($boxName)_EventLog_Apps.csv"
     $outEvt02 = ".\$($boxName)_EventLog_System.csv"
     $outSvc01 = ".\$($boxName)_Service-RunStates.log"
     $outPorts01 = ".\$($boxName)_Network-Ports.log"
     $outTask01 = ".\$($boxName)_Tasklist.log"
     $outSchTsk01 = ".\$($boxName)_Scheduled-Tasks.log"
     Filter timestamp {"Logs collected at $(Get-Date -Format "yyyy-MM-dd HH mm ss")"}

     # Application and System event logs, most recent 100 messages each.
     Get-EventLog application -newest 100 | Export-Csv $outEvt01
     timestamp | Out-File -Append $outEvt01 -Encoding ASCII
     Get-EventLog system -newest 100 | Export-Csv $outEvt02
     timestamp | Out-File -Append $outEvt02 -Encoding ASCII

     # Collect service list and current state of each.
     Get-Service | Sort-Object status | Format-Table -AutoSize | Out-File $outSvc01
     timestamp | Out-File -Append $outSvc01

     # Get process list with relevant details at time of script exec.
     cmd /c netstat -aon > $outPorts01
     timestamp | Out-File -Append $outPorts01
     cmd /c tasklist > $outTask01
     timestamp | Out-File -Append $outTask01

     Get-ScheduledTask | Select TaskName, State, TaskPath | Sort-Object -Property TaskPath | Format-Table -Wrap | Out-File $outSchTsk01
     timestamp | Out-File -Append $outSchTsk01

     # Wrap all these outputs into update state / append single file.
     # Stamp date and time into said merged output.

Starting at the top, I define a variable for the PowerShell equivalent of OS environment variables like %computername%. Trust me here: you don't want to try to call a %variable% in a PowerShell script. That's what the first line is for. Each of the following defined variables is an output path for the collections. I use .csv exports for larger data sets, since the default table outputs can heavily chop data to fit the terminal width.

Brief OCD DBA note. Being a fan of databases and Microsoft SQL, I really value a good | (pipe) to run | Select * after a command. You can filter that raw output down to the fields you want by writing a custom Select pipe. There is an example of that for Scheduled Tasks; I just wanted to word out the logic, as it took me some time to figure out that this is how I can see what my options are for selecting output fields. The other file path variables are so I do not have to type the same string twice or more. As you can see on the actual commands, I add an Out-File -Append to insert the date string into each file. Filter timestamp is my means of defining the date output string. That time will be from when the script is run, so each file will have a matching output time. Think of Filter in this context as an easier Function. The rest of the script uses either PowerShell cmdlets or OS-level commands to obtain the data I am looking for and save it to the output files.
I experimented both ways to see what output best matches the task and data I want to work with. The event log exports are pretty simple: call the 100 most recent events, save them to a .csv, then add the date string at the end of said file. The service list is sorted and exported to a .log file with the date string added (as the date is added to the other output files as well). cmd /c calls a Windows command but ignores PowerShell keywords on that line. A hugely helpful thing to know when trying to process content with an OS-level command; otherwise you will see really esoteric issues you would rather not have to figure out the secret cause of. cmd /c is quite nice, FYI. Neat. We are at the part I rambled about above in relation to databases and filtering content. I did not need many of the details in the raw output showing all the properties of that PowerShell cmdlet. Selecting the relevant fields, I then sort on the TaskPath field (to put the non-OS tasks first in the list), apply -Wrap to the Format-Table output of that cmdlet, then write the data to a local file. I have done some scripts with loops and condition evaluations, but I will stop here for the moment. If you want to gather some information about an environment, hopefully this example gets you going in the right direction for your data collections. Let me end with a link to a great resource: SS64 has some good resources and examples. They have been very helpful in conjunction with the Windows PowerShell manuals.
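PS, one follow-up tip on the Select bit: before writing the custom Select pipe, you can dump everything a cmdlet exposes to pick your fields from (stock cmdlets, nothing custom here):

     # Show every property of one scheduled task, to choose fields from:
     Get-ScheduledTask | Select-Object -First 1 -Property *
     # Or just list the property names:
     Get-ScheduledTask | Get-Member -MemberType Property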
  21. Pic0o

    Profile fields

    I spruced up the profile fields to be in this century. I also removed stuff like AIM, MSN and ICQ. All of those are dead; well, short of ICQ, which last I loaded was some weird bot advertisement network. The current version of Invision Board is nice for managing all this stuff. Even the items I added a long time ago are easily accessible in the current version. Oh yes, I condensed the subforums in here too. I had too much shit in here that no one really used. I nested the old ones under the Welcome forum.
  22. J0k3r

    2018 D33'z Build

    As far as adding you on Oculus, it's going to be a while... Once I fire up the machine I will need a day to get it updated. Another concern is that the mobo will require a BIOS update to run the chip... How do I do that when the only chip I have is the one that requires the BIOS update to run? My hope is that it will still POST; if not, my options become pretty limited...
  23. J0k3r

    2018 D33'z Build

    y0 y0. So far the build has gone as normally as I would expect. The problem is now I must commit and pull the PSU from this machine in order to power the new rig, due to me opting to save some coins and recycle. I mean, wtf, right? My PSU is still a 1600W unit, so it should be good, and the connectors look aces so far as well. The biggest change is the socket... I mean, you have to love a CPU that comes with a torque wrench pre-set to the proper torque to SCREW down your chip to the socket. LOL, no more weak-ass plastic levers! My backup plan is an emergency Best Buy stop 😛
  24. Just bumping to note I still jump into some VR content, but certainly not like 10+ hours a week lately. I do quite enjoy moving my head to wall-clip through objects. The SteamVR environment of the Steam offices is really good for that; you can even find some partially rendered hidden rooms. I tried the Budget Cuts demo, but the controls on Oculus turned me off. I did however grab Beat Saber and enjoyed a good bit of that. Rumor has it someone else recently joined the Rift VR club, so I will be exploring more of the multi-player offerings, since I don't have to rely on the kindness and patience of randoms. BigScreen VR is still really cool, even with the Oculus 2.0 Home updates. I am looking forward to hosting some video sessions and the like. Speaking of the 2.0 Oculus beta stuff, they added quite a few new environmental objects over the last month or so. As I mentioned on the phone, avoid the 2.0 update if you are on older hardware and the Oculus non-beta software is already yelling at you about not meeting requirements. I put some hours into the VR MMO called OrbusVR. For lulz we can also jump into the hive of villainy and memes that is VRChat. VRChat is free and OrbusVR goes for $30, or less on a sale. I'm not sure if it has improved, but as an Oculus user I always go for a game from the Oculus Store instead of Steam. The performance tends to be better and you avoid glitches like the ceiling height mapping that can easily get broken in SteamVR.
  25. Pic0o

    2018 D33'z Build

    Looks like tonight is the night, so long as I don't boost all your new kit off your porch! Add me on Oculus. I'll find my info and figure out how all that friending stuff works. That build looks insane.
  26. J0k3r

    2018 D33'z Build

    Attached are the specs of the new build! Last full build was Sept 2011 (in an older post here).
    CPU: AMD Ryzen Threadripper 2950X Processor (YD295XA8AFWOF)
    COOLING: Thermaltake Water 3.0 Ultimate 360mm Aluminum Radiator Triple Curve Fans AIO Enthusiast Liquid Cooling System CPU Cooler
    MEM: Corsair Vengeance RGB PRO 32GB (4x8GB) DDR4 3600MHz C18 LED Desktop Memory
    MOBO: ASUS ROG STRIX X399-E GAMING AMD Ryzen Threadripper TR4 DDR4 M.2 U.2 X399 EATX HEDT Motherboard with onboard 802.11AC WiFi, USB 3.1 Gen2, and AURA Sync RGB Lighting
    HD: Samsung 970 EVO 1TB - NVMe PCIe M.2 2280 SSD (MZ-V7E1T0BW)