May 06 17:10:26.241926 ip-10-0-131-115 systemd[1]: Starting Kubernetes Kubelet...
May 06 17:10:26.678557 ip-10-0-131-115 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 06 17:10:26.678557 ip-10-0-131-115 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
May 06 17:10:26.678557 ip-10-0-131-115 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 06 17:10:26.678557 ip-10-0-131-115 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
May 06 17:10:26.678557 ip-10-0-131-115 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 06 17:10:26.680901 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.680815 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
May 06 17:10:26.685644 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685623 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
May 06 17:10:26.685644 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685641 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
May 06 17:10:26.685644 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685645 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
May 06 17:10:26.685644 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685648 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
May 06 17:10:26.685809 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685651 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
May 06 17:10:26.685809 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685654 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
May 06 17:10:26.685809 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685657 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
May 06 17:10:26.685809 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685660 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
May 06 17:10:26.685809 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685663 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
May 06 17:10:26.685809 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685666 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
May 06 17:10:26.685809 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685669 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
May 06 17:10:26.685809 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685671 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
May 06 17:10:26.685809 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685674 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
May 06 17:10:26.685809 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685677 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
May 06 17:10:26.685809 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685680 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
May 06 17:10:26.685809 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685682 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
May 06 17:10:26.685809 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685685 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
May 06 17:10:26.685809 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685688 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
May 06 17:10:26.685809 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685692 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
May 06 17:10:26.685809 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685696 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
May 06 17:10:26.685809 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685698 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
May 06 17:10:26.685809 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685701 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
May 06 17:10:26.685809 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685704 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
May 06 17:10:26.686278 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685706 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
May 06 17:10:26.686278 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685709 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
May 06 17:10:26.686278 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685711 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
May 06 17:10:26.686278 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685714 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
May 06 17:10:26.686278 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685723 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
May 06 17:10:26.686278 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685726 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
May 06 17:10:26.686278 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685729 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
May 06 17:10:26.686278 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685731 2578 feature_gate.go:328] unrecognized feature gate: Example
May 06 17:10:26.686278 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685734 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
May 06 17:10:26.686278 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685736 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
May 06 17:10:26.686278 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685740 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
May 06 17:10:26.686278 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685742 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
May 06 17:10:26.686278 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685745 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
May 06 17:10:26.686278 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685748 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
May 06 17:10:26.686278 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685751 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
May 06 17:10:26.686278 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685754 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
May 06 17:10:26.686278 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685757 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
May 06 17:10:26.686278 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685760 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
May 06 17:10:26.686278 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685762 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
May 06 17:10:26.686278 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685766 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
May 06 17:10:26.686818 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685770 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
May 06 17:10:26.686818 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685773 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
May 06 17:10:26.686818 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685777 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
May 06 17:10:26.686818 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685780 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
May 06 17:10:26.686818 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685783 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
May 06 17:10:26.686818 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685786 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
May 06 17:10:26.686818 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685789 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
May 06 17:10:26.686818 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685792 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
May 06 17:10:26.686818 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685794 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
May 06 17:10:26.686818 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685797 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
May 06 17:10:26.686818 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685800 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
May 06 17:10:26.686818 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685803 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
May 06 17:10:26.686818 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685805 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
May 06 17:10:26.686818 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685808 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
May 06 17:10:26.686818 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685811 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
May 06 17:10:26.686818 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685813 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
May 06 17:10:26.686818 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685816 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
May 06 17:10:26.686818 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685819 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
May 06 17:10:26.686818 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685821 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
May 06 17:10:26.686818 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685824 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
May 06 17:10:26.687386 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685826 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
May 06 17:10:26.687386 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685829 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
May 06 17:10:26.687386 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685832 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
May 06 17:10:26.687386 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685834 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
May 06 17:10:26.687386 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685836 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
May 06 17:10:26.687386 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685839 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
May 06 17:10:26.687386 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685842 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
May 06 17:10:26.687386 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685845 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
May 06 17:10:26.687386 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685848 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
May 06 17:10:26.687386 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685852 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
May 06 17:10:26.687386 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685854 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
May 06 17:10:26.687386 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685857 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
May 06 17:10:26.687386 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685860 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
May 06 17:10:26.687386 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685863 2578 feature_gate.go:328] unrecognized feature gate: Example2
May 06 17:10:26.687386 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685865 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
May 06 17:10:26.687386 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685869 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
May 06 17:10:26.687386 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685871 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
May 06 17:10:26.687386 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685874 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
May 06 17:10:26.687386 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685877 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
May 06 17:10:26.687386 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685881 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
May 06 17:10:26.687888 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685884 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
May 06 17:10:26.687888 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685886 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
May 06 17:10:26.687888 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.685889 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
May 06 17:10:26.687888 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686294 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
May 06 17:10:26.687888 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686300 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
May 06 17:10:26.687888 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686302 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
May 06 17:10:26.687888 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686305 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
May 06 17:10:26.687888 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686308 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
May 06 17:10:26.687888 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686311 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
May 06 17:10:26.687888 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686314 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
May 06 17:10:26.687888 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686317 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
May 06 17:10:26.687888 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686320 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
May 06 17:10:26.687888 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686322 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
May 06 17:10:26.687888 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686325 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
May 06 17:10:26.687888 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686328 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
May 06 17:10:26.687888 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686331 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
May 06 17:10:26.687888 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686333 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
May 06 17:10:26.687888 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686336 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
May 06 17:10:26.687888 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686339 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
May 06 17:10:26.687888 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686341 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
May 06 17:10:26.688367 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686344 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
May 06 17:10:26.688367 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686347 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
May 06 17:10:26.688367 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686351 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
May 06 17:10:26.688367 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686355 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
May 06 17:10:26.688367 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686358 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
May 06 17:10:26.688367 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686361 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
May 06 17:10:26.688367 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686364 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
May 06 17:10:26.688367 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686367 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
May 06 17:10:26.688367 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686370 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
May 06 17:10:26.688367 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686372 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
May 06 17:10:26.688367 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686375 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
May 06 17:10:26.688367 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686377 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
May 06 17:10:26.688367 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686380 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
May 06 17:10:26.688367 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686383 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
May 06 17:10:26.688367 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686385 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
May 06 17:10:26.688367 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686388 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
May 06 17:10:26.688367 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686390 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
May 06 17:10:26.688367 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686393 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
May 06 17:10:26.688367 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686395 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
May 06 17:10:26.688860 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686398 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
May 06 17:10:26.688860 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686401 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
May 06 17:10:26.688860 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686404 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
May 06 17:10:26.688860 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686406 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
May 06 17:10:26.688860 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686409 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
May 06 17:10:26.688860 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686411 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
May 06 17:10:26.688860 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686414 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
May 06 17:10:26.688860 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686417 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
May 06 17:10:26.688860 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686419 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
May 06 17:10:26.688860 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686421 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
May 06 17:10:26.688860 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686424 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
May 06 17:10:26.688860 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686427 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
May 06 17:10:26.688860 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686430 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
May 06 17:10:26.688860 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686432 2578 feature_gate.go:328] unrecognized feature gate: Example
May 06 17:10:26.688860 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686435 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
May 06 17:10:26.688860 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686438 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
May 06 17:10:26.688860 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686442 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
May 06 17:10:26.688860 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686445 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
May 06 17:10:26.688860 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686447 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
May 06 17:10:26.688860 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686450 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
May 06 17:10:26.689383 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686452 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
May 06 17:10:26.689383 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686455 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
May 06 17:10:26.689383 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686458 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
May 06 17:10:26.689383 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686460 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
May 06 17:10:26.689383 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686463 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
May 06 17:10:26.689383 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686465 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
May 06 17:10:26.689383 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686468 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
May 06 17:10:26.689383 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686470 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
May 06 17:10:26.689383 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686472 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
May 06 17:10:26.689383 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686475 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
May 06 17:10:26.689383 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686478 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
May 06 17:10:26.689383 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686481 2578 feature_gate.go:328] unrecognized feature gate: Example2
May 06 17:10:26.689383 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686483 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
May 06 17:10:26.689383 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686486 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
May 06 17:10:26.689383 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686488 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
May 06 17:10:26.689383 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686490 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
May 06 17:10:26.689383 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686493 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
May 06 17:10:26.689383 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686495 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
May 06 17:10:26.689383 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686498 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
May 06 17:10:26.689383 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686500 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
May 06 17:10:26.689898 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686502 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
May 06 17:10:26.689898 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686505 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
May 06 17:10:26.689898 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686507 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
May 06 17:10:26.689898 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686513 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
May 06 17:10:26.689898 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686516 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
May 06 17:10:26.689898 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686518 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
May 06 17:10:26.689898 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686521 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
May 06 17:10:26.689898 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686523 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
May 06 17:10:26.689898 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686526 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
May 06 17:10:26.689898 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.686528 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
May 06 17:10:26.689898 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687740 2578 flags.go:64] FLAG: --address="0.0.0.0"
May 06 17:10:26.689898 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687749 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
May 06 17:10:26.689898 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687757 2578 flags.go:64] FLAG: --anonymous-auth="true"
May 06 17:10:26.689898 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687761 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
May 06 17:10:26.689898 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687766 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
May 06 17:10:26.689898 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687769 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
May 06 17:10:26.689898 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687774 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
May 06 17:10:26.689898 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687778 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
May 06 17:10:26.689898 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687782 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
May 06 17:10:26.689898 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687785 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
May 06 17:10:26.689898 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687788 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687792 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687795 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687798 2578 flags.go:64] FLAG: --cgroup-root=""
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687801 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687804 2578 flags.go:64] FLAG: --client-ca-file=""
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687806 2578 flags.go:64] FLAG: --cloud-config=""
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687809 2578 flags.go:64] FLAG: --cloud-provider="external"
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687812 2578 flags.go:64] FLAG: --cluster-dns="[]"
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687817 2578 flags.go:64] FLAG: --cluster-domain=""
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687820 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687823 2578 flags.go:64] FLAG: --config-dir=""
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687826 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687829 2578 flags.go:64] FLAG: --container-log-max-files="5"
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687833 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687841 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687844 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687847 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687851 2578 flags.go:64] FLAG: --contention-profiling="false"
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687854 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687857 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687860 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687863 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687867 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687870 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
May 06 17:10:26.690414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687873 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687875 2578 flags.go:64] FLAG: --enable-load-reader="false"
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687879 2578 flags.go:64] FLAG: --enable-server="true"
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687882 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687886 2578 flags.go:64] FLAG: --event-burst="100"
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687889 2578 flags.go:64] FLAG: --event-qps="50"
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687893 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687896 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687899 2578 flags.go:64] FLAG: --eviction-hard=""
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687903 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687906 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687909 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687912 2578 flags.go:64] FLAG: --eviction-soft=""
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687915 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687918 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687921 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687923 2578 flags.go:64] FLAG: --experimental-mounter-path=""
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687926 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687929 2578 flags.go:64] FLAG: --fail-swap-on="true"
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687932 2578 flags.go:64] FLAG: --feature-gates=""
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687936 2578 flags.go:64] FLAG: --file-check-frequency="20s"
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687939 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687942 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687946 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687949 2578 flags.go:64] FLAG: --healthz-port="10248"
May 06 17:10:26.691033 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687952 2578 flags.go:64] FLAG: --help="false"
May 06 17:10:26.691654 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687954 2578 flags.go:64] FLAG:
--hostname-override="ip-10-0-131-115.ec2.internal" May 06 17:10:26.691654 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687958 2578 flags.go:64] FLAG: --housekeeping-interval="10s" May 06 17:10:26.691654 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687961 2578 flags.go:64] FLAG: --http-check-frequency="20s" May 06 17:10:26.691654 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687964 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" May 06 17:10:26.691654 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687967 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" May 06 17:10:26.691654 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687971 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" May 06 17:10:26.691654 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687973 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" May 06 17:10:26.691654 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687976 2578 flags.go:64] FLAG: --image-service-endpoint="" May 06 17:10:26.691654 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687979 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" May 06 17:10:26.691654 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687982 2578 flags.go:64] FLAG: --kube-api-burst="100" May 06 17:10:26.691654 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687985 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" May 06 17:10:26.691654 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687988 2578 flags.go:64] FLAG: --kube-api-qps="50" May 06 17:10:26.691654 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687991 2578 flags.go:64] FLAG: --kube-reserved="" May 06 17:10:26.691654 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687993 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" May 06 17:10:26.691654 ip-10-0-131-115 
kubenswrapper[2578]: I0506 17:10:26.687996 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" May 06 17:10:26.691654 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.687999 2578 flags.go:64] FLAG: --kubelet-cgroups="" May 06 17:10:26.691654 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688002 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" May 06 17:10:26.691654 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688005 2578 flags.go:64] FLAG: --lock-file="" May 06 17:10:26.691654 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688007 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" May 06 17:10:26.691654 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688010 2578 flags.go:64] FLAG: --log-flush-frequency="5s" May 06 17:10:26.691654 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688014 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" May 06 17:10:26.691654 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688019 2578 flags.go:64] FLAG: --log-json-split-stream="false" May 06 17:10:26.691654 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688022 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688025 2578 flags.go:64] FLAG: --log-text-split-stream="false" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688027 2578 flags.go:64] FLAG: --logging-format="text" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688031 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688034 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688037 2578 flags.go:64] FLAG: --manifest-url="" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688040 2578 flags.go:64] FLAG: 
--manifest-url-header="" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688044 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688047 2578 flags.go:64] FLAG: --max-open-files="1000000" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688051 2578 flags.go:64] FLAG: --max-pods="110" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688054 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688057 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688060 2578 flags.go:64] FLAG: --memory-manager-policy="None" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688063 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688066 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688069 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688072 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688080 2578 flags.go:64] FLAG: --node-status-max-images="50" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688083 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688086 2578 flags.go:64] FLAG: --oom-score-adj="-999" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688092 2578 flags.go:64] FLAG: --pod-cidr="" May 06 17:10:26.692214 
ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688095 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3fc6c2cc09f271efd3cd2adb6c984c7cab48ea53dad824c952dee91afa8eaa20" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688100 2578 flags.go:64] FLAG: --pod-manifest-path="" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688104 2578 flags.go:64] FLAG: --pod-max-pids="-1" May 06 17:10:26.692214 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688107 2578 flags.go:64] FLAG: --pods-per-core="0" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688110 2578 flags.go:64] FLAG: --port="10250" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688112 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688115 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c2c47ed7fff2791c" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688118 2578 flags.go:64] FLAG: --qos-reserved="" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688122 2578 flags.go:64] FLAG: --read-only-port="10255" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688125 2578 flags.go:64] FLAG: --register-node="true" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688128 2578 flags.go:64] FLAG: --register-schedulable="true" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688130 2578 flags.go:64] FLAG: --register-with-taints="" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688134 2578 flags.go:64] FLAG: --registry-burst="10" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688137 2578 flags.go:64] FLAG: --registry-qps="5" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: 
I0506 17:10:26.688140 2578 flags.go:64] FLAG: --reserved-cpus="" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688142 2578 flags.go:64] FLAG: --reserved-memory="" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688146 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688149 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688153 2578 flags.go:64] FLAG: --rotate-certificates="false" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688155 2578 flags.go:64] FLAG: --rotate-server-certificates="false" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688158 2578 flags.go:64] FLAG: --runonce="false" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688161 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688164 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688167 2578 flags.go:64] FLAG: --seccomp-default="false" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688170 2578 flags.go:64] FLAG: --serialize-image-pulls="true" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688173 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688175 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688178 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688181 2578 flags.go:64] FLAG: --storage-driver-password="root" 
May 06 17:10:26.692807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688184 2578 flags.go:64] FLAG: --storage-driver-secure="false"
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688187 2578 flags.go:64] FLAG: --storage-driver-table="stats"
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688190 2578 flags.go:64] FLAG: --storage-driver-user="root"
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688193 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688196 2578 flags.go:64] FLAG: --sync-frequency="1m0s"
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688199 2578 flags.go:64] FLAG: --system-cgroups=""
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688201 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688206 2578 flags.go:64] FLAG: --system-reserved-cgroup=""
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688209 2578 flags.go:64] FLAG: --tls-cert-file=""
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688212 2578 flags.go:64] FLAG: --tls-cipher-suites="[]"
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688217 2578 flags.go:64] FLAG: --tls-min-version=""
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688219 2578 flags.go:64] FLAG: --tls-private-key-file=""
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688222 2578 flags.go:64] FLAG: --topology-manager-policy="none"
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688225 2578 flags.go:64] FLAG: --topology-manager-policy-options=""
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688228 2578 flags.go:64] FLAG: --topology-manager-scope="container"
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688231 2578 flags.go:64] FLAG: --v="2"
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688235 2578 flags.go:64] FLAG: --version="false"
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688239 2578 flags.go:64] FLAG: --vmodule=""
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688243 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.688246 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688367 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688371 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688376 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688380 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
May 06 17:10:26.693423 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688383 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
May 06 17:10:26.694044 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688386 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
May 06 17:10:26.694044 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688389 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
May 06 17:10:26.694044 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688392 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
May 06 17:10:26.694044 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688394 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
May 06 17:10:26.694044 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688397 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
May 06 17:10:26.694044 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688402 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
May 06 17:10:26.694044 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688405 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
May 06 17:10:26.694044 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688409 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
May 06 17:10:26.694044 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688411 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
May 06 17:10:26.694044 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688414 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
May 06 17:10:26.694044 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688419 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
May 06 17:10:26.694044 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688422 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
May 06 17:10:26.694044 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688424 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
May 06 17:10:26.694044 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688427 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
May 06 17:10:26.694044 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688430 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
May 06 17:10:26.694044 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688433 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
May 06 17:10:26.694044 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688435 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
May 06 17:10:26.694044 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688438 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
May 06 17:10:26.694044 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688441 2578 feature_gate.go:328] unrecognized feature gate: Example2
May 06 17:10:26.694527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688445 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
May 06 17:10:26.694527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688447 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
May 06 17:10:26.694527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688450 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
May 06 17:10:26.694527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688452 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
May 06 17:10:26.694527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688455 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
May 06 17:10:26.694527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688458 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
May 06 17:10:26.694527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688460 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
May 06 17:10:26.694527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688463 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
May 06 17:10:26.694527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688465 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
May 06 17:10:26.694527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688468 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
May 06 17:10:26.694527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688471 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
May 06 17:10:26.694527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688473 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
May 06 17:10:26.694527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688476 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
May 06 17:10:26.694527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688478 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
May 06 17:10:26.694527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688481 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
May 06 17:10:26.694527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688483 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
May 06 17:10:26.694527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688486 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
May 06 17:10:26.694527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688488 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
May 06 17:10:26.694527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688491 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
May 06 17:10:26.694527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688493 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
May 06 17:10:26.695035 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688495 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
May 06 17:10:26.695035 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688498 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
May 06 17:10:26.695035 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688500 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
May 06 17:10:26.695035 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688503 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
May 06 17:10:26.695035 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688506 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
May 06 17:10:26.695035 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688508 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
May 06 17:10:26.695035 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688511 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
May 06 17:10:26.695035 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688513 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
May 06 17:10:26.695035 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688516 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
May 06 17:10:26.695035 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688518 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
May 06 17:10:26.695035 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688521 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
May 06 17:10:26.695035 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688523 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
May 06 17:10:26.695035 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688527 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
May 06 17:10:26.695035 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688530 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
May 06 17:10:26.695035 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688532 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
May 06 17:10:26.695035 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688535 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
May 06 17:10:26.695035 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688537 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
May 06 17:10:26.695035 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688540 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
May 06 17:10:26.695035 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688542 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
May 06 17:10:26.695035 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688544 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
May 06 17:10:26.695527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688547 2578 feature_gate.go:328] unrecognized feature gate: Example
May 06 17:10:26.695527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688550 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
May 06 17:10:26.695527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688552 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
May 06 17:10:26.695527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688555 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
May 06 17:10:26.695527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688557 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
May 06 17:10:26.695527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688560 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
May 06 17:10:26.695527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688562 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
May 06 17:10:26.695527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688565 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
May 06 17:10:26.695527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688567 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
May 06 17:10:26.695527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688570 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
May 06 17:10:26.695527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688572 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
May 06 17:10:26.695527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688574 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
May 06 17:10:26.695527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688577 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
May 06 17:10:26.695527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688593 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
May 06 17:10:26.695527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688596 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
May 06 17:10:26.695527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688598 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
May 06 17:10:26.695527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688601 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
May 06 17:10:26.695527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688603 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
May 06 17:10:26.695527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688606 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
May 06 17:10:26.695527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688609 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
May 06 17:10:26.696038 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688611 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
May 06 17:10:26.696038 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.688614 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
May 06 17:10:26.696038 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.689512 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
May 06 17:10:26.697147 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.697127 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.10"
May 06 17:10:26.697184 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.697147 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
May 06 17:10:26.697214 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697198 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
May 06 17:10:26.697214 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697204 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
May 06 17:10:26.697214 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697208 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
May 06 17:10:26.697214 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697211 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
May 06 17:10:26.697214 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697214 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
May 06 17:10:26.697339 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697218 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
May 06 17:10:26.697339 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697221 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
May 06 17:10:26.697339 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697224 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
May 06 17:10:26.697339 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697226 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
May 06 17:10:26.697339 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697229 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
May 06 17:10:26.697339 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697232 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
May 06 17:10:26.697339 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697234 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
May 06 17:10:26.697339 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697237 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
May 06 17:10:26.697339 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697240 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
May 06 17:10:26.697339 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697242 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
May 06 17:10:26.697339 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697245 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
May 06 17:10:26.697339 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697247 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
May 06 17:10:26.697339 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697250 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
May 06 17:10:26.697339 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697252 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
May 06 17:10:26.697339 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697255 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
May 06 17:10:26.697339 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697257 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata May 06 17:10:26.697339 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697260 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk May 06 17:10:26.697339 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697263 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages May 06 17:10:26.697339 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697265 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts May 06 17:10:26.697339 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697268 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS May 06 17:10:26.697842 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697271 2578 feature_gate.go:328] unrecognized feature gate: DualReplica May 06 17:10:26.697842 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697274 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes May 06 17:10:26.697842 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697277 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy May 06 17:10:26.697842 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697279 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement May 06 17:10:26.697842 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697282 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure May 06 17:10:26.697842 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697284 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation May 06 17:10:26.697842 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697287 2578 feature_gate.go:328] unrecognized feature gate: NewOLM May 06 17:10:26.697842 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697290 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter May 06 
17:10:26.697842 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697292 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability May 06 17:10:26.697842 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697295 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus May 06 17:10:26.697842 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697297 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration May 06 17:10:26.697842 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697300 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration May 06 17:10:26.697842 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697302 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations May 06 17:10:26.697842 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697306 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode May 06 17:10:26.697842 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697309 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles May 06 17:10:26.697842 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697311 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider May 06 17:10:26.697842 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697314 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig May 06 17:10:26.697842 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697317 2578 feature_gate.go:328] unrecognized feature gate: Example May 06 17:10:26.697842 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697319 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets May 06 17:10:26.697842 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697322 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode May 06 17:10:26.698325 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697324 2578 
feature_gate.go:328] unrecognized feature gate: MachineConfigNodes May 06 17:10:26.698325 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697327 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform May 06 17:10:26.698325 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697329 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks May 06 17:10:26.698325 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697332 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA May 06 17:10:26.698325 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697335 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix May 06 17:10:26.698325 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697337 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy May 06 17:10:26.698325 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697340 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal May 06 17:10:26.698325 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697343 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController May 06 17:10:26.698325 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697346 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks May 06 17:10:26.698325 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697348 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS May 06 17:10:26.698325 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697351 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores May 06 17:10:26.698325 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697354 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC May 06 17:10:26.698325 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697356 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig May 06 17:10:26.698325 
ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697361 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. May 06 17:10:26.698325 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697365 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud May 06 17:10:26.698325 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697367 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement May 06 17:10:26.698325 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697370 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup May 06 17:10:26.698325 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697373 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere May 06 17:10:26.698325 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697376 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements May 06 17:10:26.698325 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697378 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation May 06 17:10:26.698827 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697381 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall May 06 17:10:26.698827 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697384 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting May 06 17:10:26.698827 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697387 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot May 06 17:10:26.698827 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697389 2578 feature_gate.go:328] unrecognized feature gate: Example2 May 06 17:10:26.698827 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697392 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager May 06 17:10:26.698827 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697396 2578 
feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas May 06 17:10:26.698827 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697398 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages May 06 17:10:26.698827 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697401 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController May 06 17:10:26.698827 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697404 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv May 06 17:10:26.698827 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697406 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities May 06 17:10:26.698827 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697409 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver May 06 17:10:26.698827 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697412 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS May 06 17:10:26.698827 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697415 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk May 06 17:10:26.698827 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697417 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall May 06 17:10:26.698827 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697420 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota May 06 17:10:26.698827 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697422 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace May 06 17:10:26.698827 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697425 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup May 06 17:10:26.698827 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697427 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud May 06 
17:10:26.698827 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697430 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts May 06 17:10:26.699366 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697434 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. May 06 17:10:26.699366 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697438 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall May 06 17:10:26.699366 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.697443 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} May 06 17:10:26.699366 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697542 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode May 06 17:10:26.699366 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697547 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission May 06 17:10:26.699366 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697551 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations May 06 17:10:26.699366 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697554 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements May 06 17:10:26.699366 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697558 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes May 06 17:10:26.699366 ip-10-0-131-115 kubenswrapper[2578]: W0506 
17:10:26.697560 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints May 06 17:10:26.699366 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697563 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores May 06 17:10:26.699366 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697566 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets May 06 17:10:26.699366 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697569 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform May 06 17:10:26.699366 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697572 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement May 06 17:10:26.699366 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697575 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall May 06 17:10:26.699366 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697578 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota May 06 17:10:26.699768 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697596 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS May 06 17:10:26.699768 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697599 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages May 06 17:10:26.699768 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697602 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController May 06 17:10:26.699768 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697605 2578 feature_gate.go:328] unrecognized feature gate: DualReplica May 06 17:10:26.699768 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697608 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud May 06 17:10:26.699768 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697610 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy May 06 17:10:26.699768 ip-10-0-131-115 
kubenswrapper[2578]: W0506 17:10:26.697613 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController May 06 17:10:26.699768 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697616 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController May 06 17:10:26.699768 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697619 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS May 06 17:10:26.699768 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697621 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration May 06 17:10:26.699768 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697624 2578 feature_gate.go:328] unrecognized feature gate: NewOLM May 06 17:10:26.699768 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697626 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup May 06 17:10:26.699768 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697629 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall May 06 17:10:26.699768 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697631 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI May 06 17:10:26.699768 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697634 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts May 06 17:10:26.699768 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697636 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig May 06 17:10:26.699768 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697639 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation May 06 17:10:26.699768 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697641 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall May 06 17:10:26.699768 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697643 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace 
May 06 17:10:26.699768 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697646 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather May 06 17:10:26.700254 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697649 2578 feature_gate.go:328] unrecognized feature gate: Example May 06 17:10:26.700254 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697651 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification May 06 17:10:26.700254 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697654 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure May 06 17:10:26.700254 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697657 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA May 06 17:10:26.700254 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697659 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode May 06 17:10:26.700254 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697663 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages May 06 17:10:26.700254 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697666 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk May 06 17:10:26.700254 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697668 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup May 06 17:10:26.700254 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697671 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv May 06 17:10:26.700254 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697674 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud May 06 17:10:26.700254 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697676 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig May 06 17:10:26.700254 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697679 2578 feature_gate.go:328] 
unrecognized feature gate: PreconfiguredUDNAddresses May 06 17:10:26.700254 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697681 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI May 06 17:10:26.700254 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697684 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS May 06 17:10:26.700254 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697686 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks May 06 17:10:26.700254 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697689 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata May 06 17:10:26.700254 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697691 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall May 06 17:10:26.700254 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697693 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig May 06 17:10:26.700254 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697696 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC May 06 17:10:26.700254 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697698 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes May 06 17:10:26.700811 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697701 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability May 06 17:10:26.700811 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697703 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS May 06 17:10:26.700811 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697706 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities May 06 17:10:26.700811 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697708 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig May 06 17:10:26.700811 ip-10-0-131-115 
kubenswrapper[2578]: W0506 17:10:26.697711 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation May 06 17:10:26.700811 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697713 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas May 06 17:10:26.700811 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697716 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure May 06 17:10:26.700811 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697718 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks May 06 17:10:26.700811 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697721 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts May 06 17:10:26.700811 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697723 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement May 06 17:10:26.700811 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697726 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation May 06 17:10:26.700811 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697729 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy May 06 17:10:26.700811 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697731 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere May 06 17:10:26.700811 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697734 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver May 06 17:10:26.700811 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697736 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal May 06 17:10:26.700811 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697739 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity May 06 17:10:26.700811 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697742 2578 feature_gate.go:328] unrecognized feature gate: Example2 May 
06 17:10:26.700811 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697744 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI May 06 17:10:26.700811 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697749 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. May 06 17:10:26.700811 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697752 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider May 06 17:10:26.701293 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697755 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix May 06 17:10:26.701293 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697758 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus May 06 17:10:26.701293 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697762 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
May 06 17:10:26.701293 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697766 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration May 06 17:10:26.701293 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697769 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting May 06 17:10:26.701293 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697771 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup May 06 17:10:26.701293 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697774 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk May 06 17:10:26.701293 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697777 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings May 06 17:10:26.701293 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697779 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration May 06 17:10:26.701293 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697781 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles May 06 17:10:26.701293 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697784 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter May 06 17:10:26.701293 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697786 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall May 06 17:10:26.701293 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697789 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot May 06 17:10:26.701293 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:26.697791 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager May 06 17:10:26.701293 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.697796 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true 
KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} May 06 17:10:26.701674 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.698474 2578 server.go:962] "Client rotation is on, will bootstrap in background" May 06 17:10:26.701674 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.701313 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" May 06 17:10:26.702642 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.702630 2578 server.go:1019] "Starting client certificate rotation" May 06 17:10:26.702745 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.702729 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" May 06 17:10:26.702778 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.702773 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" May 06 17:10:26.725478 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.725459 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" May 06 17:10:26.727885 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.727857 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" May 06 17:10:26.744477 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.744455 2578 log.go:25] "Validated CRI v1 runtime API" May 06 17:10:26.751385 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.751369 2578 log.go:25] "Validated CRI v1 image API" May 06 17:10:26.752642 ip-10-0-131-115 
kubenswrapper[2578]: I0506 17:10:26.752622 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
May 06 17:10:26.754917 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.754896 2578 fs.go:135] Filesystem UUIDs: map[3184d69d-9647-4609-98d3-dcad409c178c:/dev/nvme0n1p3 623c384a-8410-416a-8ca9-7625ce2ae413:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
May 06 17:10:26.754990 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.754916 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
May 06 17:10:26.756134 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.755991 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
May 06 17:10:26.760773 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.760655 2578 manager.go:217] Machine: {Timestamp:2026-05-06 17:10:26.758653647 +0000 UTC m=+0.400051235 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101968 MemoryCapacity:33164500992 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2f60bd6326f1f1556aa538f74087fe SystemUUID:ec2f60bd-6326-f1f1-556a-a538f74087fe BootID:969b860e-8496-4a94-a9e7-47fb0809bc12 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582250496 Type:vfs Inodes:4048401 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:cd:e0:05:b0:4f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:cd:e0:05:b0:4f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:62:7d:99:a9:71:df Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164500992 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
May 06 17:10:26.760773 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.760762 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
May 06 17:10:26.760935 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.760875 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.112.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260504-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
May 06 17:10:26.763306 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.763278 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
May 06 17:10:26.763479 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.763308 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-115.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
May 06 17:10:26.763556 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.763493 2578 topology_manager.go:138] "Creating topology manager with none policy"
May 06 17:10:26.763556 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.763506 2578 container_manager_linux.go:306] "Creating device plugin manager"
May 06 17:10:26.763556 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.763524 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
May 06 17:10:26.765037 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.765024 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
May 06 17:10:26.766446 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.766434 2578 state_mem.go:36] "Initialized new in-memory state store"
May 06 17:10:26.766571 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.766560 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet"
May 06 17:10:26.768931 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.768920 2578 kubelet.go:491] "Attempting to sync node with API server"
May 06 17:10:26.768995 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.768943 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
May 06 17:10:26.768995 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.768960 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
May 06 17:10:26.768995 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.768975 2578 kubelet.go:397] "Adding apiserver pod source"
May 06 17:10:26.769116 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.769022 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
May 06 17:10:26.770155 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.770142 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
May 06 17:10:26.770217 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.770178 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
May 06 17:10:26.772825 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.772808 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.11-2.rhaos4.20.gitb2a8320.el9" apiVersion="v1"
May 06 17:10:26.774068 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.774055 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
May 06 17:10:26.774890 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.774879 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
May 06 17:10:26.774966 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.774897 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
May 06 17:10:26.774966 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.774903 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
May 06 17:10:26.774966 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.774908 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
May 06 17:10:26.774966 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.774914 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
May 06 17:10:26.774966 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.774920 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
May 06 17:10:26.774966 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.774926 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
May 06 17:10:26.774966 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.774931 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
May 06 17:10:26.774966 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.774939 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
May 06 17:10:26.774966 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.774945 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
May 06 17:10:26.774966 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.774953 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
May 06 17:10:26.775276 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.775268 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
May 06 17:10:26.776046 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.776034 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
May 06 17:10:26.776046 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.776045 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
May 06 17:10:26.780084 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.780070 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled"
May 06 17:10:26.780162 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.780106 2578 server.go:1295] "Started kubelet"
May 06 17:10:26.781048 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.780993 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
May 06 17:10:26.781115 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.781080 2578 server_v1.go:47] "podresources" method="list" useActivePods=true
May 06 17:10:26.781412 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.781384 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
May 06 17:10:26.782264 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.782245 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
May 06 17:10:26.782895 ip-10-0-131-115 systemd[1]: Started Kubernetes Kubelet.
May 06 17:10:26.783448 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.783422 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-115.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
May 06 17:10:26.783605 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:26.783486 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
May 06 17:10:26.783605 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:26.783457 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-115.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
May 06 17:10:26.785318 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.785301 2578 server.go:317] "Adding debug handlers to kubelet server"
May 06 17:10:26.786168 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.786149 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lm8rl"
May 06 17:10:26.789634 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.789613 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
May 06 17:10:26.789634 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.789630 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
May 06 17:10:26.790380 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:26.790355 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-115.ec2.internal\" not found"
May 06 17:10:26.790459 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:26.789392 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-115.ec2.internal.18ad09180cd0b24a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-115.ec2.internal,UID:ip-10-0-131-115.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-115.ec2.internal,},FirstTimestamp:2026-05-06 17:10:26.780082762 +0000 UTC m=+0.421480351,LastTimestamp:2026-05-06 17:10:26.780082762 +0000 UTC m=+0.421480351,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-115.ec2.internal,}"
May 06 17:10:26.790696 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.790682 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
May 06 17:10:26.790802 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.790686 2578 volume_manager.go:295] "The desired_state_of_world populator starts"
May 06 17:10:26.790802 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.790711 2578 volume_manager.go:297] "Starting Kubelet Volume Manager"
May 06 17:10:26.790906 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.790865 2578 reconstruct.go:97] "Volume reconstruction finished"
May 06 17:10:26.790906 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.790876 2578 reconciler.go:26] "Reconciler: start to sync state"
May 06 17:10:26.790978 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.790921 2578 factory.go:55] Registering systemd factory
May 06 17:10:26.790978 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.790943 2578 factory.go:223] Registration of the systemd container factory successfully
May 06 17:10:26.791136 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.791117 2578 factory.go:153] Registering CRI-O factory
May 06 17:10:26.791136 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.791127 2578 factory.go:223] Registration of the crio container factory successfully
May 06 17:10:26.791236 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.791171 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
May 06 17:10:26.791236 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.791186 2578 factory.go:103] Registering Raw factory
May 06 17:10:26.791236 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.791196 2578 manager.go:1196] Started watching for new ooms in manager
May 06 17:10:26.791547 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.791532 2578 manager.go:319] Starting recovery of all containers
May 06 17:10:26.791980 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.791961 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lm8rl"
May 06 17:10:26.792946 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:26.792921 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
May 06 17:10:26.799620 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.799600 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
May 06 17:10:26.802047 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:26.802029 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-115.ec2.internal\" not found" node="ip-10-0-131-115.ec2.internal"
May 06 17:10:26.802760 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.802745 2578 manager.go:324] Recovery completed
May 06 17:10:26.806392 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.806363 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
May 06 17:10:26.808888 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.808874 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-115.ec2.internal" event="NodeHasSufficientMemory"
May 06 17:10:26.808945 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.808902 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-115.ec2.internal" event="NodeHasNoDiskPressure"
May 06 17:10:26.808945 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.808911 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-115.ec2.internal" event="NodeHasSufficientPID"
May 06 17:10:26.809378 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.809363 2578 cpu_manager.go:222] "Starting CPU manager" policy="none"
May 06 17:10:26.809422 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.809378 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
May 06 17:10:26.809422 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.809396 2578 state_mem.go:36] "Initialized new in-memory state store"
May 06 17:10:26.811986 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.811972 2578 policy_none.go:49] "None policy: Start"
May 06 17:10:26.811986 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.811989 2578 memory_manager.go:186] "Starting memorymanager" policy="None"
May 06 17:10:26.812087 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.812000 2578 state_mem.go:35] "Initializing new in-memory state store"
May 06 17:10:26.856127 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.856110 2578 manager.go:341] "Starting Device Plugin manager"
May 06 17:10:26.856278 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:26.856138 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
May 06 17:10:26.856278 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.856147 2578 server.go:85] "Starting device plugin registration server"
May 06 17:10:26.856380 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.856366 2578 eviction_manager.go:189] "Eviction manager: starting control loop"
May 06 17:10:26.856433 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.856380 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
May 06 17:10:26.856483 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.856456 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
May 06 17:10:26.856558 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.856546 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
May 06 17:10:26.856643 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.856559 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
May 06 17:10:26.857182 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:26.857161 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
May 06 17:10:26.857258 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:26.857202 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-115.ec2.internal\" not found"
May 06 17:10:26.878113 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.878091 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
May 06 17:10:26.879213 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.879198 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
May 06 17:10:26.879290 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.879225 2578 status_manager.go:230] "Starting to sync pod status with apiserver"
May 06 17:10:26.879290 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.879244 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
May 06 17:10:26.879290 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.879255 2578 kubelet.go:2451] "Starting kubelet main sync loop"
May 06 17:10:26.879426 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:26.879291 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
May 06 17:10:26.882523 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.882507 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
May 06 17:10:26.956516 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.956457 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
May 06 17:10:26.957739 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.957722 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-115.ec2.internal" event="NodeHasSufficientMemory"
May 06 17:10:26.957814 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.957752 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-115.ec2.internal" event="NodeHasNoDiskPressure"
May 06 17:10:26.957814 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.957763 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-115.ec2.internal" event="NodeHasSufficientPID"
May 06 17:10:26.957814 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.957786 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-115.ec2.internal"
May 06 17:10:26.965193 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.965178 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-115.ec2.internal"
May 06 17:10:26.965271 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:26.965199 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-115.ec2.internal\": node \"ip-10-0-131-115.ec2.internal\" not found"
May 06 17:10:26.975943 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:26.975928 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-115.ec2.internal\" not found"
May 06 17:10:26.980269 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.980252 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-115.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-115.ec2.internal"]
May 06 17:10:26.980341 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.980323 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
May 06 17:10:26.981128 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.981112 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-115.ec2.internal" event="NodeHasSufficientMemory"
May 06 17:10:26.981210 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.981143 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-115.ec2.internal" event="NodeHasNoDiskPressure"
May 06 17:10:26.981210 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.981157 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-115.ec2.internal" event="NodeHasSufficientPID"
May 06 17:10:26.982430 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.982412 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
May 06 17:10:26.982568 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.982553 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-115.ec2.internal"
May 06 17:10:26.982634 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.982601 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
May 06 17:10:26.983037 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.983022 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-115.ec2.internal" event="NodeHasSufficientMemory"
May 06 17:10:26.983123 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.983025 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-115.ec2.internal" event="NodeHasSufficientMemory"
May 06 17:10:26.983123 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.983054 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-115.ec2.internal" event="NodeHasNoDiskPressure"
May 06 17:10:26.983123 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.983068 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-115.ec2.internal" event="NodeHasSufficientPID"
May 06 17:10:26.983123 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.983076 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-115.ec2.internal" event="NodeHasNoDiskPressure"
May 06 17:10:26.983123 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.983088 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-115.ec2.internal" event="NodeHasSufficientPID"
May 06 17:10:26.984142 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.984129 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-115.ec2.internal"
May 06 17:10:26.984194 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.984151 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
May 06 17:10:26.984993 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.984979 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-115.ec2.internal" event="NodeHasSufficientMemory"
May 06 17:10:26.985043 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.984999 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-115.ec2.internal" event="NodeHasNoDiskPressure"
May 06 17:10:26.985043 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:26.985007 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-115.ec2.internal" event="NodeHasSufficientPID"
May 06 17:10:26.999138 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:26.999116 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-115.ec2.internal\" not found" node="ip-10-0-131-115.ec2.internal"
May 06 17:10:27.003274 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:27.003258 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-115.ec2.internal\" not found" node="ip-10-0-131-115.ec2.internal"
May 06 17:10:27.076850 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:27.076830 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-115.ec2.internal\" not found"
May 06 17:10:27.091729 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.091711 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7dc8d95468b132949eea9f60f0257292-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-115.ec2.internal\" (UID: \"7dc8d95468b132949eea9f60f0257292\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-115.ec2.internal"
May 06 17:10:27.091822 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.091735 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7dc8d95468b132949eea9f60f0257292-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-115.ec2.internal\" (UID: \"7dc8d95468b132949eea9f60f0257292\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-115.ec2.internal"
May 06 17:10:27.091822 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.091754 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1ee67fdc31db59363fd513d0a4d3384a-config\") pod \"kube-apiserver-proxy-ip-10-0-131-115.ec2.internal\" (UID: \"1ee67fdc31db59363fd513d0a4d3384a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-115.ec2.internal"
May 06 17:10:27.177646 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:27.177621 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-115.ec2.internal\" not found"
May 06 17:10:27.191961 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.191933 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7dc8d95468b132949eea9f60f0257292-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-115.ec2.internal\" (UID: \"7dc8d95468b132949eea9f60f0257292\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-115.ec2.internal"
May 06 17:10:27.191961 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.191890 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7dc8d95468b132949eea9f60f0257292-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-115.ec2.internal\" (UID: \"7dc8d95468b132949eea9f60f0257292\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-115.ec2.internal"
May 06 17:10:27.192108 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.191986 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7dc8d95468b132949eea9f60f0257292-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-115.ec2.internal\" (UID: \"7dc8d95468b132949eea9f60f0257292\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-115.ec2.internal"
May 06 17:10:27.192108 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.192005 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1ee67fdc31db59363fd513d0a4d3384a-config\") pod \"kube-apiserver-proxy-ip-10-0-131-115.ec2.internal\" (UID: \"1ee67fdc31db59363fd513d0a4d3384a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-115.ec2.internal"
May 06 17:10:27.192108 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.192031 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1ee67fdc31db59363fd513d0a4d3384a-config\") pod \"kube-apiserver-proxy-ip-10-0-131-115.ec2.internal\" (UID: \"1ee67fdc31db59363fd513d0a4d3384a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-115.ec2.internal"
May 06 17:10:27.192108 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.192063 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7dc8d95468b132949eea9f60f0257292-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-115.ec2.internal\" (UID: \"7dc8d95468b132949eea9f60f0257292\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-115.ec2.internal"
May 06 17:10:27.278320 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:27.278265 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-115.ec2.internal\" not found"
May 06 17:10:27.301697 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.301682 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-115.ec2.internal"
May 06 17:10:27.306211 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.306196 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-115.ec2.internal"
May 06 17:10:27.379035 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:27.379006 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-115.ec2.internal\" not found"
May 06 17:10:27.479505 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:27.479471 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-115.ec2.internal\" not found"
May 06 17:10:27.580086 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:27.580021 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-115.ec2.internal\" not found"
May 06 17:10:27.680537 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:27.680507 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-115.ec2.internal\" not found"
May 06 17:10:27.703014 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.702988 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
May 06 17:10:27.703146 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.703130 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
May 06 17:10:27.703192 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.703162 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
May 06 17:10:27.781616 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:27.781570 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-115.ec2.internal\" not found"
May 06 17:10:27.789772 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.789750 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
May 06 17:10:27.793495 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.793469 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-05-05 17:05:26 +0000 UTC" deadline="2027-10-17 05:32:38.228365795 +0000 UTC"
May 06 17:10:27.793495 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.793493 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12684h22m10.4348758s"
May 06 17:10:27.804063 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.804042 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
May 06 17:10:27.813949 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:27.813921 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ee67fdc31db59363fd513d0a4d3384a.slice/crio-923b6ef33d1ee81254794b1390795b64081f94e6a95ebae9db81f856c3599fc8 WatchSource:0}: Error finding container 923b6ef33d1ee81254794b1390795b64081f94e6a95ebae9db81f856c3599fc8: Status 404 returned error can't find the container with id 923b6ef33d1ee81254794b1390795b64081f94e6a95ebae9db81f856c3599fc8
May 06 17:10:27.814155 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:27.814132 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dc8d95468b132949eea9f60f0257292.slice/crio-12ae894ecf9236b76ab419278bbe28a48f72e6dfbe89d870676fa396a7eb9945 WatchSource:0}: Error finding container 12ae894ecf9236b76ab419278bbe28a48f72e6dfbe89d870676fa396a7eb9945: Status 404 returned error can't find the container with id 12ae894ecf9236b76ab419278bbe28a48f72e6dfbe89d870676fa396a7eb9945
May 06 17:10:27.817447 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.817432 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
May 06 17:10:27.825506 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.825490 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-6hp46"
May 06 17:10:27.833347 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.833299 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-6hp46"
May 06 17:10:27.881665 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:27.881644 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-115.ec2.internal\" not found"
May 06 17:10:27.881785 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.881700 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-115.ec2.internal" event={"ID":"7dc8d95468b132949eea9f60f0257292","Type":"ContainerStarted","Data":"12ae894ecf9236b76ab419278bbe28a48f72e6dfbe89d870676fa396a7eb9945"}
May 06 17:10:27.882630 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:27.882610 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-115.ec2.internal" event={"ID":"1ee67fdc31db59363fd513d0a4d3384a","Type":"ContainerStarted","Data":"923b6ef33d1ee81254794b1390795b64081f94e6a95ebae9db81f856c3599fc8"}
May 06 17:10:27.982218 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:27.982036 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-115.ec2.internal\" not found"
May 06 17:10:28.082529 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:28.082496 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-115.ec2.internal\" not found"
May 06 17:10:28.183099 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:28.183024 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-115.ec2.internal\" not found"
May 06 17:10:28.283727 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:28.283697 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-115.ec2.internal\" not found"
May 06 17:10:28.322646 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.322617 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
May 06 17:10:28.361658 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.361631 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
May 06 17:10:28.390423
ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.390399 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-115.ec2.internal" May 06 17:10:28.405258 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.405135 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" May 06 17:10:28.406553 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.406356 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-115.ec2.internal" May 06 17:10:28.416423 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.416400 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" May 06 17:10:28.646739 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.646653 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" May 06 17:10:28.769674 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.769635 2578 apiserver.go:52] "Watching apiserver" May 06 17:10:28.778220 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.778192 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" May 06 17:10:28.778644 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.778623 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-network-operator/iptables-alerter-h8tck","openshift-ovn-kubernetes/ovnkube-node-lz4g2","kube-system/kube-apiserver-proxy-ip-10-0-131-115.ec2.internal","openshift-cluster-node-tuning-operator/tuned-999z4","openshift-multus/multus-f7wdz","openshift-multus/network-metrics-daemon-4cnzp","kube-system/konnectivity-agent-mtlxr","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq","openshift-image-registry/node-ca-pwg45","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-115.ec2.internal","openshift-multus/multus-additional-cni-plugins-z6k67","openshift-network-diagnostics/network-check-target-k855j"] May 06 17:10:28.780209 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.780190 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-h8tck" May 06 17:10:28.781463 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.781447 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:28.782701 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.782682 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-f7wdz" May 06 17:10:28.783903 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.783886 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mtlxr" May 06 17:10:28.785336 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.785296 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.786985 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.786965 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4cnzp" May 06 17:10:28.787091 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:28.787036 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4cnzp" podUID="e0f197dc-dd5c-4f23-ad0d-f076fc70415f" May 06 17:10:28.788476 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.788442 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq" May 06 17:10:28.789856 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.789842 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pwg45" May 06 17:10:28.791153 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.791114 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z6k67" May 06 17:10:28.792402 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.792380 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k855j" May 06 17:10:28.792480 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:28.792437 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-k855j" podUID="c55e3ee0-9cb2-41dc-9751-75fb3367d774" May 06 17:10:28.796631 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.796614 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" May 06 17:10:28.796721 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.796619 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-wz2vr\"" May 06 17:10:28.800645 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.800626 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0a374dc8-39cb-403e-a34c-6d0d7185b09a-device-dir\") pod \"aws-ebs-csi-driver-node-tt8fq\" (UID: \"0a374dc8-39cb-403e-a34c-6d0d7185b09a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq" May 06 17:10:28.800744 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.800656 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-multus-conf-dir\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.800744 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.800678 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-host-kubelet\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.800744 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.800694 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-var-lib-openvswitch\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.800895 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.800755 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-lib-modules\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:28.800895 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.800818 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.800895 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.800850 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxg7v\" (UniqueName: \"kubernetes.io/projected/a16518eb-5b61-43e2-b86f-2f0f8bda5d9a-kube-api-access-cxg7v\") pod \"iptables-alerter-h8tck\" (UID: \"a16518eb-5b61-43e2-b86f-2f0f8bda5d9a\") " pod="openshift-network-operator/iptables-alerter-h8tck" May 06 17:10:28.800895 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.800873 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-host-run-netns\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.801091 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.800903 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-host-cni-bin\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.801091 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.800924 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-host-cni-netd\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.801091 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.800942 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-system-cni-dir\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.801091 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.800956 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-etc-kubernetes\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.801091 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.800976 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-node-log\") pod 
\"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.801091 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.801007 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9225285c-2bd9-49db-af85-96b0a3f45d5a-serviceca\") pod \"node-ca-pwg45\" (UID: \"9225285c-2bd9-49db-af85-96b0a3f45d5a\") " pod="openshift-image-registry/node-ca-pwg45" May 06 17:10:28.801091 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.801032 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67" May 06 17:10:28.801398 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.801101 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-etc-sysctl-d\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:28.801398 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.801124 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-host-var-lib-cni-bin\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.801398 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.801161 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"hostroot\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-hostroot\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.801398 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.801198 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cbdbc6d6-c384-4259-8a3b-f40e37586a30-multus-daemon-config\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.801398 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.801239 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-etc-kubernetes\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:28.801398 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.801278 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cbdbc6d6-c384-4259-8a3b-f40e37586a30-cni-binary-copy\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.801398 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.801316 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-run-ovn\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.801398 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.801336 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9225285c-2bd9-49db-af85-96b0a3f45d5a-host\") pod \"node-ca-pwg45\" (UID: \"9225285c-2bd9-49db-af85-96b0a3f45d5a\") " pod="openshift-image-registry/node-ca-pwg45" May 06 17:10:28.801398 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.801359 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-etc-sysctl-conf\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:28.801398 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.801380 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-tmp\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:28.801811 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.801405 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c7ac35c9-4cf5-489f-9fd7-4e950edc1678-agent-certs\") pod \"konnectivity-agent-mtlxr\" (UID: \"c7ac35c9-4cf5-489f-9fd7-4e950edc1678\") " pod="kube-system/konnectivity-agent-mtlxr" May 06 17:10:28.801811 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.801423 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-run-systemd\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.801811 ip-10-0-131-115 kubenswrapper[2578]: I0506 
17:10:28.801479 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/67d17f57-c01f-4b15-8664-18245c28cece-ovnkube-config\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.801811 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.801508 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67" May 06 17:10:28.801811 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.801547 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-etc-modprobe-d\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:28.802041 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.802020 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-etc-systemd\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:28.802120 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.802106 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-host\") pod \"tuned-999z4\" (UID: 
\"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:28.802178 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.802143 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vflkv\" (UniqueName: \"kubernetes.io/projected/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-kube-api-access-vflkv\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:28.802232 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.802194 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-host-run-k8s-cni-cncf-io\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.802287 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.802232 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-os-release\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67" May 06 17:10:28.802335 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.802279 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67" May 06 17:10:28.802393 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.802351 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a16518eb-5b61-43e2-b86f-2f0f8bda5d9a-iptables-alerter-script\") pod \"iptables-alerter-h8tck\" (UID: \"a16518eb-5b61-43e2-b86f-2f0f8bda5d9a\") " pod="openshift-network-operator/iptables-alerter-h8tck" May 06 17:10:28.802465 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.802448 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0a374dc8-39cb-403e-a34c-6d0d7185b09a-socket-dir\") pod \"aws-ebs-csi-driver-node-tt8fq\" (UID: \"0a374dc8-39cb-403e-a34c-6d0d7185b09a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq" May 06 17:10:28.802519 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.802503 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldrtp\" (UniqueName: \"kubernetes.io/projected/cbdbc6d6-c384-4259-8a3b-f40e37586a30-kube-api-access-ldrtp\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.802851 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.802634 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-cnibin\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67" May 06 17:10:28.802851 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.802746 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-cni-binary-copy\") pod \"multus-additional-cni-plugins-z6k67\" (UID: 
\"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67" May 06 17:10:28.802992 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.802879 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-run-openvswitch\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.802992 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.802945 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-run\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:28.803091 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.803022 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs\") pod \"network-metrics-daemon-4cnzp\" (UID: \"e0f197dc-dd5c-4f23-ad0d-f076fc70415f\") " pod="openshift-multus/network-metrics-daemon-4cnzp" May 06 17:10:28.803091 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.803063 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0a374dc8-39cb-403e-a34c-6d0d7185b09a-registration-dir\") pod \"aws-ebs-csi-driver-node-tt8fq\" (UID: \"0a374dc8-39cb-403e-a34c-6d0d7185b09a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq" May 06 17:10:28.803172 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.803103 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-host-var-lib-kubelet\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.803172 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.803132 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-systemd-units\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.803262 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.803194 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/67d17f57-c01f-4b15-8664-18245c28cece-ovn-node-metrics-cert\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.803262 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.803248 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-etc-tuned\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:28.803352 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.803308 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0a374dc8-39cb-403e-a34c-6d0d7185b09a-sys-fs\") pod \"aws-ebs-csi-driver-node-tt8fq\" (UID: \"0a374dc8-39cb-403e-a34c-6d0d7185b09a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq" May 06 17:10:28.803398 ip-10-0-131-115 kubenswrapper[2578]: I0506 
17:10:28.803382 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6tkd\" (UniqueName: \"kubernetes.io/projected/0a374dc8-39cb-403e-a34c-6d0d7185b09a-kube-api-access-r6tkd\") pod \"aws-ebs-csi-driver-node-tt8fq\" (UID: \"0a374dc8-39cb-403e-a34c-6d0d7185b09a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq" May 06 17:10:28.803468 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.803432 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-cnibin\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.803567 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.803489 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-host-run-netns\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.803567 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.803516 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-log-socket\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.803684 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.803622 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/67d17f57-c01f-4b15-8664-18245c28cece-ovnkube-script-lib\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.803731 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.803696 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b9kb\" (UniqueName: \"kubernetes.io/projected/67d17f57-c01f-4b15-8664-18245c28cece-kube-api-access-4b9kb\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.803789 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.803749 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-etc-sysconfig\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:28.803840 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.803807 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0a374dc8-39cb-403e-a34c-6d0d7185b09a-etc-selinux\") pod \"aws-ebs-csi-driver-node-tt8fq\" (UID: \"0a374dc8-39cb-403e-a34c-6d0d7185b09a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq" May 06 17:10:28.803887 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.803843 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-multus-socket-dir-parent\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.803887 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.803871 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-host-run-multus-certs\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.803995 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.803895 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-host-run-ovn-kubernetes\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.803995 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.803936 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/67d17f57-c01f-4b15-8664-18245c28cece-env-overrides\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.804103 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.804025 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-var-lib-kubelet\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:28.804103 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.804060 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c7ac35c9-4cf5-489f-9fd7-4e950edc1678-konnectivity-ca\") pod \"konnectivity-agent-mtlxr\" (UID: \"c7ac35c9-4cf5-489f-9fd7-4e950edc1678\") " pod="kube-system/konnectivity-agent-mtlxr" May 06 17:10:28.804205 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.804190 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-os-release\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.804253 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.804225 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-host-var-lib-cni-multus\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.804338 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.804311 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48h94\" (UniqueName: \"kubernetes.io/projected/9225285c-2bd9-49db-af85-96b0a3f45d5a-kube-api-access-48h94\") pod \"node-ca-pwg45\" (UID: \"9225285c-2bd9-49db-af85-96b0a3f45d5a\") " pod="openshift-image-registry/node-ca-pwg45" May 06 17:10:28.804409 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.804381 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plv57\" (UniqueName: \"kubernetes.io/projected/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-kube-api-access-plv57\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67" May 06 17:10:28.804462 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.804433 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-sys\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " 
pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:28.804539 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.804485 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mst4\" (UniqueName: \"kubernetes.io/projected/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-kube-api-access-2mst4\") pod \"network-metrics-daemon-4cnzp\" (UID: \"e0f197dc-dd5c-4f23-ad0d-f076fc70415f\") " pod="openshift-multus/network-metrics-daemon-4cnzp" May 06 17:10:28.804539 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.804516 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-multus-cni-dir\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.804539 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.804532 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" May 06 17:10:28.804708 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.804544 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-host-slash\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.804708 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.804607 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-etc-openvswitch\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.804708 ip-10-0-131-115 
kubenswrapper[2578]: I0506 17:10:28.804655 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-system-cni-dir\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67" May 06 17:10:28.804858 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.804740 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a16518eb-5b61-43e2-b86f-2f0f8bda5d9a-host-slash\") pod \"iptables-alerter-h8tck\" (UID: \"a16518eb-5b61-43e2-b86f-2f0f8bda5d9a\") " pod="openshift-network-operator/iptables-alerter-h8tck" May 06 17:10:28.804932 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.804916 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0a374dc8-39cb-403e-a34c-6d0d7185b09a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tt8fq\" (UID: \"0a374dc8-39cb-403e-a34c-6d0d7185b09a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq" May 06 17:10:28.806190 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.806170 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" May 06 17:10:28.806190 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.806184 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" May 06 17:10:28.806329 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.806192 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" May 06 17:10:28.806329 ip-10-0-131-115 kubenswrapper[2578]: I0506 
17:10:28.806273 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" May 06 17:10:28.806329 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.806187 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" May 06 17:10:28.806492 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.806469 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" May 06 17:10:28.806565 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.806548 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wnmdq\"" May 06 17:10:28.806668 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.806649 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" May 06 17:10:28.806725 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.806676 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" May 06 17:10:28.806807 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.806788 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" May 06 17:10:28.806861 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.806809 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" May 06 17:10:28.806911 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.806796 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-mdbrn\"" May 06 17:10:28.806911 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.806895 2578 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" May 06 17:10:28.807052 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.807033 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" May 06 17:10:28.807104 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.807083 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" May 06 17:10:28.807320 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.807300 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" May 06 17:10:28.807320 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.807327 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" May 06 17:10:28.807320 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.807379 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" May 06 17:10:28.807836 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.807813 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-xf56v\"" May 06 17:10:28.807943 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.807867 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" May 06 17:10:28.807943 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.807899 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2twxn\"" May 06 17:10:28.807943 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.807819 2578 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-h7f27\"" May 06 17:10:28.808804 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.808787 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" May 06 17:10:28.808911 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.808897 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" May 06 17:10:28.808973 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.808789 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" May 06 17:10:28.809065 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.809045 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" May 06 17:10:28.809144 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.809102 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-dfgww\"" May 06 17:10:28.809269 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.809243 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" May 06 17:10:28.809269 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.809253 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" May 06 17:10:28.809375 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.809268 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-pn98x\"" May 06 17:10:28.809574 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.809554 2578 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" May 06 17:10:28.834174 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.834152 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-05-05 17:05:27 +0000 UTC" deadline="2027-12-24 20:32:19.747464603 +0000 UTC" May 06 17:10:28.834174 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.834172 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14331h21m50.91329541s" May 06 17:10:28.892533 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.892510 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" May 06 17:10:28.905537 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905470 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0a374dc8-39cb-403e-a34c-6d0d7185b09a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tt8fq\" (UID: \"0a374dc8-39cb-403e-a34c-6d0d7185b09a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq" May 06 17:10:28.905537 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905500 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0a374dc8-39cb-403e-a34c-6d0d7185b09a-device-dir\") pod \"aws-ebs-csi-driver-node-tt8fq\" (UID: \"0a374dc8-39cb-403e-a34c-6d0d7185b09a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq" May 06 17:10:28.905537 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905525 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-multus-conf-dir\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " 
pod="openshift-multus/multus-f7wdz" May 06 17:10:28.905730 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905568 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-host-kubelet\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.905730 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905578 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-multus-conf-dir\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.905730 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905578 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0a374dc8-39cb-403e-a34c-6d0d7185b09a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tt8fq\" (UID: \"0a374dc8-39cb-403e-a34c-6d0d7185b09a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq" May 06 17:10:28.905730 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905623 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-var-lib-openvswitch\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.905730 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905656 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-host-kubelet\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.905730 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905662 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-lib-modules\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:28.905730 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905672 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0a374dc8-39cb-403e-a34c-6d0d7185b09a-device-dir\") pod \"aws-ebs-csi-driver-node-tt8fq\" (UID: \"0a374dc8-39cb-403e-a34c-6d0d7185b09a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq" May 06 17:10:28.905730 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905694 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-var-lib-openvswitch\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.905730 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905704 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.906019 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905741 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxg7v\" (UniqueName: 
\"kubernetes.io/projected/a16518eb-5b61-43e2-b86f-2f0f8bda5d9a-kube-api-access-cxg7v\") pod \"iptables-alerter-h8tck\" (UID: \"a16518eb-5b61-43e2-b86f-2f0f8bda5d9a\") " pod="openshift-network-operator/iptables-alerter-h8tck" May 06 17:10:28.906019 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905765 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-host-run-netns\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.906019 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905783 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-lib-modules\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:28.906019 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905790 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-host-cni-bin\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.906019 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905804 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.906019 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905814 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-host-cni-netd\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.906019 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905856 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-system-cni-dir\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.906019 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905866 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-host-cni-netd\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.906019 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905886 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-etc-kubernetes\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.906019 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905924 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-host-run-netns\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.906019 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905927 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-node-log\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.906019 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905952 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9225285c-2bd9-49db-af85-96b0a3f45d5a-serviceca\") pod \"node-ca-pwg45\" (UID: \"9225285c-2bd9-49db-af85-96b0a3f45d5a\") " pod="openshift-image-registry/node-ca-pwg45" May 06 17:10:28.906019 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905976 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-host-cni-bin\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.906019 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.905985 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67" May 06 17:10:28.906019 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906010 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-etc-sysctl-d\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:28.906019 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906015 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-etc-kubernetes\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.906759 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906033 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-host-var-lib-cni-bin\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.906759 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906060 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-hostroot\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.906759 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906064 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-system-cni-dir\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.906759 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906083 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cbdbc6d6-c384-4259-8a3b-f40e37586a30-multus-daemon-config\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.906759 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906107 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-etc-kubernetes\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4"
May 06 17:10:28.906759 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906140 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cbdbc6d6-c384-4259-8a3b-f40e37586a30-cni-binary-copy\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.906759 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906159 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-host-var-lib-cni-bin\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.906759 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906165 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-run-ovn\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:28.906759 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906178 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-etc-sysctl-d\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4"
May 06 17:10:28.906759 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906206 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-run-ovn\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:28.906759 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906195 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9225285c-2bd9-49db-af85-96b0a3f45d5a-host\") pod \"node-ca-pwg45\" (UID: \"9225285c-2bd9-49db-af85-96b0a3f45d5a\") " pod="openshift-image-registry/node-ca-pwg45"
May 06 17:10:28.906759 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906237 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-etc-sysctl-conf\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4"
May 06 17:10:28.906759 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906274 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-node-log\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:28.906759 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906343 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67"
May 06 17:10:28.906759 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906385 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-hostroot\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.906759 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906404 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-etc-kubernetes\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4"
May 06 17:10:28.906759 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906513 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-etc-sysctl-conf\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4"
May 06 17:10:28.906759 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906548 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9225285c-2bd9-49db-af85-96b0a3f45d5a-host\") pod \"node-ca-pwg45\" (UID: \"9225285c-2bd9-49db-af85-96b0a3f45d5a\") " pod="openshift-image-registry/node-ca-pwg45"
May 06 17:10:28.907578 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906683 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9225285c-2bd9-49db-af85-96b0a3f45d5a-serviceca\") pod \"node-ca-pwg45\" (UID: \"9225285c-2bd9-49db-af85-96b0a3f45d5a\") " pod="openshift-image-registry/node-ca-pwg45"
May 06 17:10:28.907578 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906852 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cbdbc6d6-c384-4259-8a3b-f40e37586a30-cni-binary-copy\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.907578 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906901 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
May 06 17:10:28.907578 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906436 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-tmp\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4"
May 06 17:10:28.907578 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906963 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c7ac35c9-4cf5-489f-9fd7-4e950edc1678-agent-certs\") pod \"konnectivity-agent-mtlxr\" (UID: \"c7ac35c9-4cf5-489f-9fd7-4e950edc1678\") " pod="kube-system/konnectivity-agent-mtlxr"
May 06 17:10:28.907578 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.906993 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-run-systemd\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:28.907578 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907018 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/67d17f57-c01f-4b15-8664-18245c28cece-ovnkube-config\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:28.907578 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907046 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67"
May 06 17:10:28.907578 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907072 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-etc-modprobe-d\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4"
May 06 17:10:28.907578 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907104 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-etc-systemd\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4"
May 06 17:10:28.907578 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907129 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-host\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4"
May 06 17:10:28.907578 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907152 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vflkv\" (UniqueName: \"kubernetes.io/projected/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-kube-api-access-vflkv\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4"
May 06 17:10:28.907578 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907179 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-host-run-k8s-cni-cncf-io\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.907578 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907205 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-os-release\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67"
May 06 17:10:28.907578 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907230 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67"
May 06 17:10:28.907578 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907252 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-etc-systemd\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4"
May 06 17:10:28.907578 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907257 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a16518eb-5b61-43e2-b86f-2f0f8bda5d9a-iptables-alerter-script\") pod \"iptables-alerter-h8tck\" (UID: \"a16518eb-5b61-43e2-b86f-2f0f8bda5d9a\") " pod="openshift-network-operator/iptables-alerter-h8tck"
May 06 17:10:28.907578 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907309 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0a374dc8-39cb-403e-a34c-6d0d7185b09a-socket-dir\") pod \"aws-ebs-csi-driver-node-tt8fq\" (UID: \"0a374dc8-39cb-403e-a34c-6d0d7185b09a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq"
May 06 17:10:28.908346 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907314 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-host\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4"
May 06 17:10:28.908346 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907341 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldrtp\" (UniqueName: \"kubernetes.io/projected/cbdbc6d6-c384-4259-8a3b-f40e37586a30-kube-api-access-ldrtp\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.908346 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907351 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cbdbc6d6-c384-4259-8a3b-f40e37586a30-multus-daemon-config\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.908346 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907369 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-cnibin\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67"
May 06 17:10:28.908346 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907395 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-cni-binary-copy\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67"
May 06 17:10:28.908346 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907431 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-etc-modprobe-d\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4"
May 06 17:10:28.908346 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907474 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-run-openvswitch\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:28.908346 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907479 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-host-run-k8s-cni-cncf-io\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.908346 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907439 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-run-openvswitch\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:28.908346 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907527 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-run\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4"
May 06 17:10:28.908346 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907563 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs\") pod \"network-metrics-daemon-4cnzp\" (UID: \"e0f197dc-dd5c-4f23-ad0d-f076fc70415f\") " pod="openshift-multus/network-metrics-daemon-4cnzp"
May 06 17:10:28.908346 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907572 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0a374dc8-39cb-403e-a34c-6d0d7185b09a-socket-dir\") pod \"aws-ebs-csi-driver-node-tt8fq\" (UID: \"0a374dc8-39cb-403e-a34c-6d0d7185b09a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq"
May 06 17:10:28.908346 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907607 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0a374dc8-39cb-403e-a34c-6d0d7185b09a-registration-dir\") pod \"aws-ebs-csi-driver-node-tt8fq\" (UID: \"0a374dc8-39cb-403e-a34c-6d0d7185b09a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq"
May 06 17:10:28.908346 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907632 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-host-var-lib-kubelet\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.908346 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907666 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-systemd-units\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:28.908346 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907693 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/67d17f57-c01f-4b15-8664-18245c28cece-ovn-node-metrics-cert\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:28.908346 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907718 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-etc-tuned\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4"
May 06 17:10:28.908988 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907749 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0a374dc8-39cb-403e-a34c-6d0d7185b09a-sys-fs\") pod \"aws-ebs-csi-driver-node-tt8fq\" (UID: \"0a374dc8-39cb-403e-a34c-6d0d7185b09a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq"
May 06 17:10:28.908988 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907776 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6tkd\" (UniqueName: \"kubernetes.io/projected/0a374dc8-39cb-403e-a34c-6d0d7185b09a-kube-api-access-r6tkd\") pod \"aws-ebs-csi-driver-node-tt8fq\" (UID: \"0a374dc8-39cb-403e-a34c-6d0d7185b09a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq"
May 06 17:10:28.908988 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907801 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-cnibin\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.908988 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907796 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a16518eb-5b61-43e2-b86f-2f0f8bda5d9a-iptables-alerter-script\") pod \"iptables-alerter-h8tck\" (UID: \"a16518eb-5b61-43e2-b86f-2f0f8bda5d9a\") " pod="openshift-network-operator/iptables-alerter-h8tck"
May 06 17:10:28.908988 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907826 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-host-run-netns\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.908988 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907864 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-log-socket\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:28.908988 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907876 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-host-run-netns\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.908988 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907892 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/67d17f57-c01f-4b15-8664-18245c28cece-ovnkube-script-lib\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:28.908988 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907895 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-cnibin\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67"
May 06 17:10:28.908988 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907925 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4b9kb\" (UniqueName: \"kubernetes.io/projected/67d17f57-c01f-4b15-8664-18245c28cece-kube-api-access-4b9kb\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:28.908988 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907932 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-run\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4"
May 06 17:10:28.908988 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907953 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-etc-sysconfig\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4"
May 06 17:10:28.908988 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907980 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0a374dc8-39cb-403e-a34c-6d0d7185b09a-etc-selinux\") pod \"aws-ebs-csi-driver-node-tt8fq\" (UID: \"0a374dc8-39cb-403e-a34c-6d0d7185b09a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq"
May 06 17:10:28.908988 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908007 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-multus-socket-dir-parent\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.908988 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-host-var-lib-kubelet\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.908988 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908032 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-host-run-multus-certs\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.908988 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908077 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-systemd-units\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:28.909798 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908090 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-host-run-multus-certs\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.909798 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908108 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-host-run-ovn-kubernetes\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:28.909798 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908134 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/67d17f57-c01f-4b15-8664-18245c28cece-env-overrides\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:28.909798 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908160 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-var-lib-kubelet\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4"
May 06 17:10:28.909798 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908187 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c7ac35c9-4cf5-489f-9fd7-4e950edc1678-konnectivity-ca\") pod \"konnectivity-agent-mtlxr\" (UID: \"c7ac35c9-4cf5-489f-9fd7-4e950edc1678\") " pod="kube-system/konnectivity-agent-mtlxr"
May 06 17:10:28.909798 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908214 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-os-release\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.909798 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908236 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/67d17f57-c01f-4b15-8664-18245c28cece-ovnkube-config\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:28.909798 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908257 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-host-var-lib-cni-multus\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.909798 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908281 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48h94\" (UniqueName: \"kubernetes.io/projected/9225285c-2bd9-49db-af85-96b0a3f45d5a-kube-api-access-48h94\") pod \"node-ca-pwg45\" (UID: \"9225285c-2bd9-49db-af85-96b0a3f45d5a\") " pod="openshift-image-registry/node-ca-pwg45"
May 06 17:10:28.909798 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908289 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-run-systemd\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:28.909798 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908312 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29mkk\" (UniqueName: \"kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk\") pod \"network-check-target-k855j\" (UID: \"c55e3ee0-9cb2-41dc-9751-75fb3367d774\") " pod="openshift-network-diagnostics/network-check-target-k855j"
May 06 17:10:28.909798 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908346 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plv57\" (UniqueName: \"kubernetes.io/projected/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-kube-api-access-plv57\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67"
May 06 17:10:28.909798 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908372 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-sys\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4"
May 06 17:10:28.909798 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908419 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mst4\" (UniqueName: \"kubernetes.io/projected/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-kube-api-access-2mst4\") pod \"network-metrics-daemon-4cnzp\" (UID: \"e0f197dc-dd5c-4f23-ad0d-f076fc70415f\") " pod="openshift-multus/network-metrics-daemon-4cnzp"
May 06 17:10:28.909798 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908459 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-multus-cni-dir\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.909798 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908484 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-host-slash\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:28.909798 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908528 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-etc-openvswitch\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:28.910507 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908555 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-system-cni-dir\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67"
May 06 17:10:28.910507 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908596 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a16518eb-5b61-43e2-b86f-2f0f8bda5d9a-host-slash\") pod \"iptables-alerter-h8tck\" (UID: \"a16518eb-5b61-43e2-b86f-2f0f8bda5d9a\") " pod="openshift-network-operator/iptables-alerter-h8tck"
May 06 17:10:28.910507 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908635 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-log-socket\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:28.910507 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908676 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a16518eb-5b61-43e2-b86f-2f0f8bda5d9a-host-slash\") pod \"iptables-alerter-h8tck\" (UID: \"a16518eb-5b61-43e2-b86f-2f0f8bda5d9a\") " pod="openshift-network-operator/iptables-alerter-h8tck"
May 06 17:10:28.910507 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.908723 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-os-release\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.910507 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907435 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-os-release\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67"
May 06 17:10:28.910507 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.909077 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-sys\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4"
May 06 17:10:28.910507 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.909122 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-host-var-lib-cni-multus\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.910507 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.909117 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67"
May 06 17:10:28.910507 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.909177 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-etc-sysconfig\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4"
May 06 17:10:28.910507 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.909256 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0a374dc8-39cb-403e-a34c-6d0d7185b09a-etc-selinux\") pod \"aws-ebs-csi-driver-node-tt8fq\" (UID: \"0a374dc8-39cb-403e-a34c-6d0d7185b09a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq"
May 06 17:10:28.910507 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.907979 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0a374dc8-39cb-403e-a34c-6d0d7185b09a-registration-dir\") pod \"aws-ebs-csi-driver-node-tt8fq\" (UID: \"0a374dc8-39cb-403e-a34c-6d0d7185b09a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq"
May 06 17:10:28.910507 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.909317 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-multus-socket-dir-parent\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.910507 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.909407 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-cni-binary-copy\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67"
May 06 17:10:28.910507 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.909482 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0a374dc8-39cb-403e-a34c-6d0d7185b09a-sys-fs\") pod \"aws-ebs-csi-driver-node-tt8fq\" (UID: \"0a374dc8-39cb-403e-a34c-6d0d7185b09a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq"
May 06 17:10:28.910507 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:28.909414 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
May 06 17:10:28.910507 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.909578 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-host-slash\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:28.910507 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.909676 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-cnibin\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.911267 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.909817 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-etc-openvswitch\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:28.911267 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.909834 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cbdbc6d6-c384-4259-8a3b-f40e37586a30-multus-cni-dir\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz"
May 06 17:10:28.911267 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.909880 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-system-cni-dir\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67"
May 06 17:10:28.911267 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.909894 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/67d17f57-c01f-4b15-8664-18245c28cece-ovnkube-script-lib\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.911267 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.909897 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/67d17f57-c01f-4b15-8664-18245c28cece-host-run-ovn-kubernetes\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.911267 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.909900 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-var-lib-kubelet\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:28.911267 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:28.909578 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs podName:e0f197dc-dd5c-4f23-ad0d-f076fc70415f nodeName:}" failed. No retries permitted until 2026-05-06 17:10:29.409545307 +0000 UTC m=+3.050942899 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs") pod "network-metrics-daemon-4cnzp" (UID: "e0f197dc-dd5c-4f23-ad0d-f076fc70415f") : object "openshift-multus"/"metrics-daemon-secret" not registered May 06 17:10:28.911267 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.911100 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " pod="openshift-multus/multus-additional-cni-plugins-z6k67" May 06 17:10:28.911267 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.911133 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/67d17f57-c01f-4b15-8664-18245c28cece-env-overrides\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.911693 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.911316 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-tmp\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:28.911693 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.911548 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c7ac35c9-4cf5-489f-9fd7-4e950edc1678-agent-certs\") pod \"konnectivity-agent-mtlxr\" (UID: \"c7ac35c9-4cf5-489f-9fd7-4e950edc1678\") " pod="kube-system/konnectivity-agent-mtlxr" May 06 17:10:28.912358 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.912337 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c7ac35c9-4cf5-489f-9fd7-4e950edc1678-konnectivity-ca\") pod \"konnectivity-agent-mtlxr\" (UID: \"c7ac35c9-4cf5-489f-9fd7-4e950edc1678\") " pod="kube-system/konnectivity-agent-mtlxr" May 06 17:10:28.912884 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.912862 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-etc-tuned\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:28.915342 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.915319 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/67d17f57-c01f-4b15-8664-18245c28cece-ovn-node-metrics-cert\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.919804 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.919781 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b9kb\" (UniqueName: \"kubernetes.io/projected/67d17f57-c01f-4b15-8664-18245c28cece-kube-api-access-4b9kb\") pod \"ovnkube-node-lz4g2\" (UID: \"67d17f57-c01f-4b15-8664-18245c28cece\") " pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:28.919939 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.919911 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxg7v\" (UniqueName: \"kubernetes.io/projected/a16518eb-5b61-43e2-b86f-2f0f8bda5d9a-kube-api-access-cxg7v\") pod \"iptables-alerter-h8tck\" (UID: \"a16518eb-5b61-43e2-b86f-2f0f8bda5d9a\") " pod="openshift-network-operator/iptables-alerter-h8tck" May 06 17:10:28.922397 ip-10-0-131-115 kubenswrapper[2578]: I0506 
17:10:28.922349 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48h94\" (UniqueName: \"kubernetes.io/projected/9225285c-2bd9-49db-af85-96b0a3f45d5a-kube-api-access-48h94\") pod \"node-ca-pwg45\" (UID: \"9225285c-2bd9-49db-af85-96b0a3f45d5a\") " pod="openshift-image-registry/node-ca-pwg45" May 06 17:10:28.922569 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.922545 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldrtp\" (UniqueName: \"kubernetes.io/projected/cbdbc6d6-c384-4259-8a3b-f40e37586a30-kube-api-access-ldrtp\") pod \"multus-f7wdz\" (UID: \"cbdbc6d6-c384-4259-8a3b-f40e37586a30\") " pod="openshift-multus/multus-f7wdz" May 06 17:10:28.922695 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.922577 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mst4\" (UniqueName: \"kubernetes.io/projected/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-kube-api-access-2mst4\") pod \"network-metrics-daemon-4cnzp\" (UID: \"e0f197dc-dd5c-4f23-ad0d-f076fc70415f\") " pod="openshift-multus/network-metrics-daemon-4cnzp" May 06 17:10:28.923228 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.923198 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6tkd\" (UniqueName: \"kubernetes.io/projected/0a374dc8-39cb-403e-a34c-6d0d7185b09a-kube-api-access-r6tkd\") pod \"aws-ebs-csi-driver-node-tt8fq\" (UID: \"0a374dc8-39cb-403e-a34c-6d0d7185b09a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq" May 06 17:10:28.924698 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.924669 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-plv57\" (UniqueName: \"kubernetes.io/projected/ae7c9848-2f33-464c-a9e2-b97ba5c1b57c-kube-api-access-plv57\") pod \"multus-additional-cni-plugins-z6k67\" (UID: \"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c\") " 
pod="openshift-multus/multus-additional-cni-plugins-z6k67" May 06 17:10:28.924799 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:28.924778 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vflkv\" (UniqueName: \"kubernetes.io/projected/3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93-kube-api-access-vflkv\") pod \"tuned-999z4\" (UID: \"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93\") " pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:29.008986 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.008928 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29mkk\" (UniqueName: \"kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk\") pod \"network-check-target-k855j\" (UID: \"c55e3ee0-9cb2-41dc-9751-75fb3367d774\") " pod="openshift-network-diagnostics/network-check-target-k855j" May 06 17:10:29.014940 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:29.014918 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 06 17:10:29.014940 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:29.014939 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 06 17:10:29.015104 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:29.014949 2578 projected.go:194] Error preparing data for projected volume kube-api-access-29mkk for pod openshift-network-diagnostics/network-check-target-k855j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 06 17:10:29.015104 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:29.015016 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk podName:c55e3ee0-9cb2-41dc-9751-75fb3367d774 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:29.515001115 +0000 UTC m=+3.156398699 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-29mkk" (UniqueName: "kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk") pod "network-check-target-k855j" (UID: "c55e3ee0-9cb2-41dc-9751-75fb3367d774") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 06 17:10:29.092960 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.092922 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-h8tck" May 06 17:10:29.100729 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.100705 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-999z4" May 06 17:10:29.112453 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.112427 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-f7wdz" May 06 17:10:29.116085 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.116063 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mtlxr" May 06 17:10:29.122736 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.122714 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:29.128313 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.128294 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq" May 06 17:10:29.134854 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.134836 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pwg45" May 06 17:10:29.139396 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.139375 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z6k67" May 06 17:10:29.322041 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.321917 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-wvb5c"] May 06 17:10:29.323925 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.323904 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvb5c" May 06 17:10:29.324046 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:29.323981 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wvb5c" podUID="8a7c9e10-c274-46eb-b020-deee09868a53" May 06 17:10:29.412162 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.412125 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret\") pod \"global-pull-secret-syncer-wvb5c\" (UID: \"8a7c9e10-c274-46eb-b020-deee09868a53\") " pod="kube-system/global-pull-secret-syncer-wvb5c" May 06 17:10:29.412338 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.412171 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8a7c9e10-c274-46eb-b020-deee09868a53-kubelet-config\") pod \"global-pull-secret-syncer-wvb5c\" (UID: \"8a7c9e10-c274-46eb-b020-deee09868a53\") " pod="kube-system/global-pull-secret-syncer-wvb5c" May 06 17:10:29.412338 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.412242 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs\") pod \"network-metrics-daemon-4cnzp\" (UID: \"e0f197dc-dd5c-4f23-ad0d-f076fc70415f\") " pod="openshift-multus/network-metrics-daemon-4cnzp" May 06 17:10:29.412338 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.412320 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8a7c9e10-c274-46eb-b020-deee09868a53-dbus\") pod \"global-pull-secret-syncer-wvb5c\" (UID: \"8a7c9e10-c274-46eb-b020-deee09868a53\") " pod="kube-system/global-pull-secret-syncer-wvb5c" May 06 17:10:29.412338 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:29.412337 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered May 06 17:10:29.412509 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:29.412398 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs podName:e0f197dc-dd5c-4f23-ad0d-f076fc70415f nodeName:}" failed. No retries permitted until 2026-05-06 17:10:30.412373956 +0000 UTC m=+4.053771545 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs") pod "network-metrics-daemon-4cnzp" (UID: "e0f197dc-dd5c-4f23-ad0d-f076fc70415f") : object "openshift-multus"/"metrics-daemon-secret" not registered May 06 17:10:29.510935 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:29.510907 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67d17f57_c01f_4b15_8664_18245c28cece.slice/crio-d165ba1f156f75b77a992fcea71849759aa9172b848adc28757722698bc1be6c WatchSource:0}: Error finding container d165ba1f156f75b77a992fcea71849759aa9172b848adc28757722698bc1be6c: Status 404 returned error can't find the container with id d165ba1f156f75b77a992fcea71849759aa9172b848adc28757722698bc1be6c May 06 17:10:29.511876 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:29.511850 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7ac35c9_4cf5_489f_9fd7_4e950edc1678.slice/crio-ead8643f07bd716c4f4daa62ea013812fbcb17f1d8c0176bed755a18453f266a WatchSource:0}: Error finding container ead8643f07bd716c4f4daa62ea013812fbcb17f1d8c0176bed755a18453f266a: Status 404 returned error can't find the container with id ead8643f07bd716c4f4daa62ea013812fbcb17f1d8c0176bed755a18453f266a May 06 17:10:29.512861 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.512755 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret\") pod \"global-pull-secret-syncer-wvb5c\" (UID: \"8a7c9e10-c274-46eb-b020-deee09868a53\") " pod="kube-system/global-pull-secret-syncer-wvb5c" May 06 17:10:29.512861 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.512800 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8a7c9e10-c274-46eb-b020-deee09868a53-kubelet-config\") pod \"global-pull-secret-syncer-wvb5c\" (UID: \"8a7c9e10-c274-46eb-b020-deee09868a53\") " pod="kube-system/global-pull-secret-syncer-wvb5c" May 06 17:10:29.512990 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:29.512903 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered May 06 17:10:29.512990 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:29.512963 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret podName:8a7c9e10-c274-46eb-b020-deee09868a53 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:30.0129394 +0000 UTC m=+3.654336995 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret") pod "global-pull-secret-syncer-wvb5c" (UID: "8a7c9e10-c274-46eb-b020-deee09868a53") : object "kube-system"/"original-pull-secret" not registered May 06 17:10:29.513143 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.513006 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8a7c9e10-c274-46eb-b020-deee09868a53-kubelet-config\") pod \"global-pull-secret-syncer-wvb5c\" (UID: \"8a7c9e10-c274-46eb-b020-deee09868a53\") " pod="kube-system/global-pull-secret-syncer-wvb5c" May 06 17:10:29.513143 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.513013 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8a7c9e10-c274-46eb-b020-deee09868a53-dbus\") pod \"global-pull-secret-syncer-wvb5c\" (UID: \"8a7c9e10-c274-46eb-b020-deee09868a53\") " pod="kube-system/global-pull-secret-syncer-wvb5c" May 06 17:10:29.513378 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.513142 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8a7c9e10-c274-46eb-b020-deee09868a53-dbus\") pod \"global-pull-secret-syncer-wvb5c\" (UID: \"8a7c9e10-c274-46eb-b020-deee09868a53\") " pod="kube-system/global-pull-secret-syncer-wvb5c" May 06 17:10:29.513478 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:29.513440 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a374dc8_39cb_403e_a34c_6d0d7185b09a.slice/crio-e511bdbe5700e97e9827489865a171afe8a08db7a77cdd553a632426057d2a8d WatchSource:0}: Error finding container e511bdbe5700e97e9827489865a171afe8a08db7a77cdd553a632426057d2a8d: Status 404 returned error can't find the container with id 
e511bdbe5700e97e9827489865a171afe8a08db7a77cdd553a632426057d2a8d May 06 17:10:29.514620 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:29.514547 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae7c9848_2f33_464c_a9e2_b97ba5c1b57c.slice/crio-bdd009bd4e2e56cc159c8875a4047498d1a6baf8c05c2dfea73a0998ae299ec4 WatchSource:0}: Error finding container bdd009bd4e2e56cc159c8875a4047498d1a6baf8c05c2dfea73a0998ae299ec4: Status 404 returned error can't find the container with id bdd009bd4e2e56cc159c8875a4047498d1a6baf8c05c2dfea73a0998ae299ec4 May 06 17:10:29.614138 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.614111 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29mkk\" (UniqueName: \"kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk\") pod \"network-check-target-k855j\" (UID: \"c55e3ee0-9cb2-41dc-9751-75fb3367d774\") " pod="openshift-network-diagnostics/network-check-target-k855j" May 06 17:10:29.614267 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:29.614250 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 06 17:10:29.614311 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:29.614271 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 06 17:10:29.614311 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:29.614280 2578 projected.go:194] Error preparing data for projected volume kube-api-access-29mkk for pod openshift-network-diagnostics/network-check-target-k855j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 06 17:10:29.614378 
ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:29.614321 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk podName:c55e3ee0-9cb2-41dc-9751-75fb3367d774 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:30.61430762 +0000 UTC m=+4.255705197 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-29mkk" (UniqueName: "kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk") pod "network-check-target-k855j" (UID: "c55e3ee0-9cb2-41dc-9751-75fb3367d774") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 06 17:10:29.835153 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.835086 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-05-05 17:05:27 +0000 UTC" deadline="2027-10-05 14:15:36.758805723 +0000 UTC" May 06 17:10:29.835153 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.835115 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12405h5m6.923693322s" May 06 17:10:29.886437 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.886404 2578 generic.go:358] "Generic (PLEG): container finished" podID="7dc8d95468b132949eea9f60f0257292" containerID="4dd09d4a1d35b9e0d57395bfb3bc838e5ffe571e75569be51ffe9c46e77b1149" exitCode=0 May 06 17:10:29.886560 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.886458 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-115.ec2.internal" event={"ID":"7dc8d95468b132949eea9f60f0257292","Type":"ContainerDied","Data":"4dd09d4a1d35b9e0d57395bfb3bc838e5ffe571e75569be51ffe9c46e77b1149"} May 06 17:10:29.889291 ip-10-0-131-115 kubenswrapper[2578]: 
I0506 17:10:29.889239 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-999z4" event={"ID":"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93","Type":"ContainerStarted","Data":"3667ebb080120bff5a756fbb3912801df4bbbed2944584cfde60903ef5fc2aac"} May 06 17:10:29.891729 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.891694 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f7wdz" event={"ID":"cbdbc6d6-c384-4259-8a3b-f40e37586a30","Type":"ContainerStarted","Data":"e3ca1f04b126f84c874a312263982754970b5278076c9cc90f4bf6f2ffa1c1af"} May 06 17:10:29.893677 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.893649 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pwg45" event={"ID":"9225285c-2bd9-49db-af85-96b0a3f45d5a","Type":"ContainerStarted","Data":"36717f82e828a8637d76f4b2717c0fd23e834b58e07d78dde6540ef8e87f34f9"} May 06 17:10:29.894978 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.894941 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq" event={"ID":"0a374dc8-39cb-403e-a34c-6d0d7185b09a","Type":"ContainerStarted","Data":"e511bdbe5700e97e9827489865a171afe8a08db7a77cdd553a632426057d2a8d"} May 06 17:10:29.896470 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.896340 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mtlxr" event={"ID":"c7ac35c9-4cf5-489f-9fd7-4e950edc1678","Type":"ContainerStarted","Data":"ead8643f07bd716c4f4daa62ea013812fbcb17f1d8c0176bed755a18453f266a"} May 06 17:10:29.898388 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.898093 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-115.ec2.internal" event={"ID":"1ee67fdc31db59363fd513d0a4d3384a","Type":"ContainerStarted","Data":"1f1f87d927fbe2c8b57b816c1959303a310bcdbcf8426c6838b9db1e5d32610c"} May 06 
17:10:29.899576 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.899552 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-h8tck" event={"ID":"a16518eb-5b61-43e2-b86f-2f0f8bda5d9a","Type":"ContainerStarted","Data":"b399dbca3817c30d9b6556592a7662b2b912a71c4813813c4e4efc6f5d86f2c1"}
May 06 17:10:29.900850 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.900817 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6k67" event={"ID":"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c","Type":"ContainerStarted","Data":"bdd009bd4e2e56cc159c8875a4047498d1a6baf8c05c2dfea73a0998ae299ec4"}
May 06 17:10:29.902073 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:29.902054 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" event={"ID":"67d17f57-c01f-4b15-8664-18245c28cece","Type":"ContainerStarted","Data":"d165ba1f156f75b77a992fcea71849759aa9172b848adc28757722698bc1be6c"}
May 06 17:10:30.018331 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:30.017812 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret\") pod \"global-pull-secret-syncer-wvb5c\" (UID: \"8a7c9e10-c274-46eb-b020-deee09868a53\") " pod="kube-system/global-pull-secret-syncer-wvb5c"
May 06 17:10:30.018331 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:30.017990 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
May 06 17:10:30.018331 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:30.018042 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret podName:8a7c9e10-c274-46eb-b020-deee09868a53 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:31.018024694 +0000 UTC m=+4.659422275 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret") pod "global-pull-secret-syncer-wvb5c" (UID: "8a7c9e10-c274-46eb-b020-deee09868a53") : object "kube-system"/"original-pull-secret" not registered
May 06 17:10:30.422091 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:30.422057 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs\") pod \"network-metrics-daemon-4cnzp\" (UID: \"e0f197dc-dd5c-4f23-ad0d-f076fc70415f\") " pod="openshift-multus/network-metrics-daemon-4cnzp"
May 06 17:10:30.422259 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:30.422199 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
May 06 17:10:30.422335 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:30.422262 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs podName:e0f197dc-dd5c-4f23-ad0d-f076fc70415f nodeName:}" failed. No retries permitted until 2026-05-06 17:10:32.422242207 +0000 UTC m=+6.063639788 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs") pod "network-metrics-daemon-4cnzp" (UID: "e0f197dc-dd5c-4f23-ad0d-f076fc70415f") : object "openshift-multus"/"metrics-daemon-secret" not registered
May 06 17:10:30.623967 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:30.623927 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29mkk\" (UniqueName: \"kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk\") pod \"network-check-target-k855j\" (UID: \"c55e3ee0-9cb2-41dc-9751-75fb3367d774\") " pod="openshift-network-diagnostics/network-check-target-k855j"
May 06 17:10:30.624151 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:30.624140 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
May 06 17:10:30.624210 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:30.624161 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
May 06 17:10:30.624210 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:30.624174 2578 projected.go:194] Error preparing data for projected volume kube-api-access-29mkk for pod openshift-network-diagnostics/network-check-target-k855j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
May 06 17:10:30.624301 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:30.624229 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk podName:c55e3ee0-9cb2-41dc-9751-75fb3367d774 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:32.624211052 +0000 UTC m=+6.265608633 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-29mkk" (UniqueName: "kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk") pod "network-check-target-k855j" (UID: "c55e3ee0-9cb2-41dc-9751-75fb3367d774") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
May 06 17:10:30.880507 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:30.879731 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvb5c"
May 06 17:10:30.880507 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:30.879859 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvb5c" podUID="8a7c9e10-c274-46eb-b020-deee09868a53"
May 06 17:10:30.880507 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:30.880291 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4cnzp"
May 06 17:10:30.880507 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:30.880395 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4cnzp" podUID="e0f197dc-dd5c-4f23-ad0d-f076fc70415f"
May 06 17:10:30.881411 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:30.881242 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k855j"
May 06 17:10:30.881411 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:30.881334 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k855j" podUID="c55e3ee0-9cb2-41dc-9751-75fb3367d774"
May 06 17:10:30.914657 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:30.914623 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-115.ec2.internal" event={"ID":"7dc8d95468b132949eea9f60f0257292","Type":"ContainerStarted","Data":"2991d02b034f233ee0d7c56dc4ae642bee9c3b3a75af30944056532edaa6ef01"}
May 06 17:10:30.945027 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:30.944974 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-115.ec2.internal" podStartSLOduration=2.944957052 podStartE2EDuration="2.944957052s" podCreationTimestamp="2026-05-06 17:10:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-06 17:10:29.913259696 +0000 UTC m=+3.554657295" watchObservedRunningTime="2026-05-06 17:10:30.944957052 +0000 UTC m=+4.586354651"
May 06 17:10:31.027635 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:31.027593 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret\") pod \"global-pull-secret-syncer-wvb5c\" (UID: \"8a7c9e10-c274-46eb-b020-deee09868a53\") " pod="kube-system/global-pull-secret-syncer-wvb5c"
May 06 17:10:31.027807 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:31.027774 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
May 06 17:10:31.027870 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:31.027834 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret podName:8a7c9e10-c274-46eb-b020-deee09868a53 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:33.027817138 +0000 UTC m=+6.669214719 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret") pod "global-pull-secret-syncer-wvb5c" (UID: "8a7c9e10-c274-46eb-b020-deee09868a53") : object "kube-system"/"original-pull-secret" not registered
May 06 17:10:32.440130 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:32.440045 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs\") pod \"network-metrics-daemon-4cnzp\" (UID: \"e0f197dc-dd5c-4f23-ad0d-f076fc70415f\") " pod="openshift-multus/network-metrics-daemon-4cnzp"
May 06 17:10:32.440561 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:32.440196 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
May 06 17:10:32.440561 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:32.440264 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs podName:e0f197dc-dd5c-4f23-ad0d-f076fc70415f nodeName:}" failed. No retries permitted until 2026-05-06 17:10:36.440241223 +0000 UTC m=+10.081638815 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs") pod "network-metrics-daemon-4cnzp" (UID: "e0f197dc-dd5c-4f23-ad0d-f076fc70415f") : object "openshift-multus"/"metrics-daemon-secret" not registered
May 06 17:10:32.641357 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:32.641321 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29mkk\" (UniqueName: \"kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk\") pod \"network-check-target-k855j\" (UID: \"c55e3ee0-9cb2-41dc-9751-75fb3367d774\") " pod="openshift-network-diagnostics/network-check-target-k855j"
May 06 17:10:32.641521 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:32.641500 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
May 06 17:10:32.641521 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:32.641518 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
May 06 17:10:32.641660 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:32.641531 2578 projected.go:194] Error preparing data for projected volume kube-api-access-29mkk for pod openshift-network-diagnostics/network-check-target-k855j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
May 06 17:10:32.641660 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:32.641599 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk podName:c55e3ee0-9cb2-41dc-9751-75fb3367d774 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:36.64156556 +0000 UTC m=+10.282963142 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-29mkk" (UniqueName: "kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk") pod "network-check-target-k855j" (UID: "c55e3ee0-9cb2-41dc-9751-75fb3367d774") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
May 06 17:10:32.881106 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:32.880526 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvb5c"
May 06 17:10:32.881106 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:32.880675 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvb5c" podUID="8a7c9e10-c274-46eb-b020-deee09868a53"
May 06 17:10:32.881106 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:32.880685 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k855j"
May 06 17:10:32.881106 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:32.880713 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4cnzp"
May 06 17:10:32.881106 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:32.880764 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k855j" podUID="c55e3ee0-9cb2-41dc-9751-75fb3367d774"
May 06 17:10:32.881106 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:32.880842 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4cnzp" podUID="e0f197dc-dd5c-4f23-ad0d-f076fc70415f"
May 06 17:10:33.045467 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:33.045429 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret\") pod \"global-pull-secret-syncer-wvb5c\" (UID: \"8a7c9e10-c274-46eb-b020-deee09868a53\") " pod="kube-system/global-pull-secret-syncer-wvb5c"
May 06 17:10:33.045675 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:33.045616 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
May 06 17:10:33.045750 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:33.045707 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret podName:8a7c9e10-c274-46eb-b020-deee09868a53 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:37.045683964 +0000 UTC m=+10.687081556 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret") pod "global-pull-secret-syncer-wvb5c" (UID: "8a7c9e10-c274-46eb-b020-deee09868a53") : object "kube-system"/"original-pull-secret" not registered
May 06 17:10:33.979376 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:33.979257 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-115.ec2.internal" podStartSLOduration=5.979238717 podStartE2EDuration="5.979238717s" podCreationTimestamp="2026-05-06 17:10:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-06 17:10:30.945529146 +0000 UTC m=+4.586926746" watchObservedRunningTime="2026-05-06 17:10:33.979238717 +0000 UTC m=+7.620636317"
May 06 17:10:33.979852 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:33.979540 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-4lbvf"]
May 06 17:10:33.984291 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:33.983646 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4lbvf"
May 06 17:10:33.986413 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:33.986390 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
May 06 17:10:33.986520 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:33.986394 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
May 06 17:10:33.988885 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:33.988865 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-rvt4b\""
May 06 17:10:34.054761 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:34.054729 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d07b8d8f-4948-4e10-8402-db41f4c64242-hosts-file\") pod \"node-resolver-4lbvf\" (UID: \"d07b8d8f-4948-4e10-8402-db41f4c64242\") " pod="openshift-dns/node-resolver-4lbvf"
May 06 17:10:34.054929 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:34.054770 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d07b8d8f-4948-4e10-8402-db41f4c64242-tmp-dir\") pod \"node-resolver-4lbvf\" (UID: \"d07b8d8f-4948-4e10-8402-db41f4c64242\") " pod="openshift-dns/node-resolver-4lbvf"
May 06 17:10:34.054929 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:34.054804 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gk6h\" (UniqueName: \"kubernetes.io/projected/d07b8d8f-4948-4e10-8402-db41f4c64242-kube-api-access-7gk6h\") pod \"node-resolver-4lbvf\" (UID: \"d07b8d8f-4948-4e10-8402-db41f4c64242\") " pod="openshift-dns/node-resolver-4lbvf"
May 06 17:10:34.156667 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:34.156237 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d07b8d8f-4948-4e10-8402-db41f4c64242-hosts-file\") pod \"node-resolver-4lbvf\" (UID: \"d07b8d8f-4948-4e10-8402-db41f4c64242\") " pod="openshift-dns/node-resolver-4lbvf"
May 06 17:10:34.156667 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:34.156282 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d07b8d8f-4948-4e10-8402-db41f4c64242-tmp-dir\") pod \"node-resolver-4lbvf\" (UID: \"d07b8d8f-4948-4e10-8402-db41f4c64242\") " pod="openshift-dns/node-resolver-4lbvf"
May 06 17:10:34.156667 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:34.156314 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gk6h\" (UniqueName: \"kubernetes.io/projected/d07b8d8f-4948-4e10-8402-db41f4c64242-kube-api-access-7gk6h\") pod \"node-resolver-4lbvf\" (UID: \"d07b8d8f-4948-4e10-8402-db41f4c64242\") " pod="openshift-dns/node-resolver-4lbvf"
May 06 17:10:34.156667 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:34.156391 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d07b8d8f-4948-4e10-8402-db41f4c64242-hosts-file\") pod \"node-resolver-4lbvf\" (UID: \"d07b8d8f-4948-4e10-8402-db41f4c64242\") " pod="openshift-dns/node-resolver-4lbvf"
May 06 17:10:34.156984 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:34.156681 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d07b8d8f-4948-4e10-8402-db41f4c64242-tmp-dir\") pod \"node-resolver-4lbvf\" (UID: \"d07b8d8f-4948-4e10-8402-db41f4c64242\") " pod="openshift-dns/node-resolver-4lbvf"
May 06 17:10:34.168036 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:34.167958 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gk6h\" (UniqueName: \"kubernetes.io/projected/d07b8d8f-4948-4e10-8402-db41f4c64242-kube-api-access-7gk6h\") pod \"node-resolver-4lbvf\" (UID: \"d07b8d8f-4948-4e10-8402-db41f4c64242\") " pod="openshift-dns/node-resolver-4lbvf"
May 06 17:10:34.297831 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:34.297749 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4lbvf"
May 06 17:10:34.879999 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:34.879970 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k855j"
May 06 17:10:34.880176 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:34.880143 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k855j" podUID="c55e3ee0-9cb2-41dc-9751-75fb3367d774"
May 06 17:10:34.880329 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:34.880299 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvb5c"
May 06 17:10:34.880432 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:34.880404 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvb5c" podUID="8a7c9e10-c274-46eb-b020-deee09868a53"
May 06 17:10:34.880533 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:34.880514 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4cnzp"
May 06 17:10:34.880662 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:34.880641 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4cnzp" podUID="e0f197dc-dd5c-4f23-ad0d-f076fc70415f"
May 06 17:10:36.475596 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:36.474994 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs\") pod \"network-metrics-daemon-4cnzp\" (UID: \"e0f197dc-dd5c-4f23-ad0d-f076fc70415f\") " pod="openshift-multus/network-metrics-daemon-4cnzp"
May 06 17:10:36.475596 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:36.475177 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
May 06 17:10:36.475596 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:36.475240 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs podName:e0f197dc-dd5c-4f23-ad0d-f076fc70415f nodeName:}" failed. No retries permitted until 2026-05-06 17:10:44.475222 +0000 UTC m=+18.116619581 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs") pod "network-metrics-daemon-4cnzp" (UID: "e0f197dc-dd5c-4f23-ad0d-f076fc70415f") : object "openshift-multus"/"metrics-daemon-secret" not registered
May 06 17:10:36.677368 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:36.677332 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29mkk\" (UniqueName: \"kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk\") pod \"network-check-target-k855j\" (UID: \"c55e3ee0-9cb2-41dc-9751-75fb3367d774\") " pod="openshift-network-diagnostics/network-check-target-k855j"
May 06 17:10:36.677553 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:36.677534 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
May 06 17:10:36.677630 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:36.677562 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
May 06 17:10:36.677630 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:36.677575 2578 projected.go:194] Error preparing data for projected volume kube-api-access-29mkk for pod openshift-network-diagnostics/network-check-target-k855j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
May 06 17:10:36.677727 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:36.677651 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk podName:c55e3ee0-9cb2-41dc-9751-75fb3367d774 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:44.677632301 +0000 UTC m=+18.319029892 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-29mkk" (UniqueName: "kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk") pod "network-check-target-k855j" (UID: "c55e3ee0-9cb2-41dc-9751-75fb3367d774") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
May 06 17:10:36.880647 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:36.880622 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k855j"
May 06 17:10:36.880775 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:36.880692 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvb5c"
May 06 17:10:36.880841 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:36.880785 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4cnzp"
May 06 17:10:36.880897 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:36.880819 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvb5c" podUID="8a7c9e10-c274-46eb-b020-deee09868a53"
May 06 17:10:36.881006 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:36.880884 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4cnzp" podUID="e0f197dc-dd5c-4f23-ad0d-f076fc70415f"
May 06 17:10:36.881006 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:36.880980 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k855j" podUID="c55e3ee0-9cb2-41dc-9751-75fb3367d774"
May 06 17:10:37.079818 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:37.079780 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret\") pod \"global-pull-secret-syncer-wvb5c\" (UID: \"8a7c9e10-c274-46eb-b020-deee09868a53\") " pod="kube-system/global-pull-secret-syncer-wvb5c"
May 06 17:10:37.080000 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:37.079982 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
May 06 17:10:37.080074 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:37.080051 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret podName:8a7c9e10-c274-46eb-b020-deee09868a53 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:45.080031775 +0000 UTC m=+18.721429357 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret") pod "global-pull-secret-syncer-wvb5c" (UID: "8a7c9e10-c274-46eb-b020-deee09868a53") : object "kube-system"/"original-pull-secret" not registered
May 06 17:10:38.879998 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:38.879963 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvb5c"
May 06 17:10:38.879998 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:38.879986 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k855j"
May 06 17:10:38.880510 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:38.880107 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k855j" podUID="c55e3ee0-9cb2-41dc-9751-75fb3367d774"
May 06 17:10:38.880510 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:38.880205 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvb5c" podUID="8a7c9e10-c274-46eb-b020-deee09868a53"
May 06 17:10:38.880510 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:38.880262 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4cnzp"
May 06 17:10:38.880510 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:38.880338 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4cnzp" podUID="e0f197dc-dd5c-4f23-ad0d-f076fc70415f"
May 06 17:10:40.879534 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:40.879490 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k855j"
May 06 17:10:40.879983 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:40.879502 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvb5c"
May 06 17:10:40.879983 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:40.879652 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k855j" podUID="c55e3ee0-9cb2-41dc-9751-75fb3367d774"
May 06 17:10:40.879983 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:40.879490 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4cnzp"
May 06 17:10:40.879983 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:40.879747 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvb5c" podUID="8a7c9e10-c274-46eb-b020-deee09868a53"
May 06 17:10:40.879983 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:40.879856 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4cnzp" podUID="e0f197dc-dd5c-4f23-ad0d-f076fc70415f"
May 06 17:10:42.880316 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:42.880279 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4cnzp"
May 06 17:10:42.880713 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:42.880392 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k855j"
May 06 17:10:42.880713 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:42.880411 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4cnzp" podUID="e0f197dc-dd5c-4f23-ad0d-f076fc70415f"
May 06 17:10:42.880713 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:42.880463 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k855j" podUID="c55e3ee0-9cb2-41dc-9751-75fb3367d774"
May 06 17:10:42.880713 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:42.880474 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvb5c"
May 06 17:10:42.880713 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:42.880560 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvb5c" podUID="8a7c9e10-c274-46eb-b020-deee09868a53"
May 06 17:10:44.534250 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:44.534215 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs\") pod \"network-metrics-daemon-4cnzp\" (UID: \"e0f197dc-dd5c-4f23-ad0d-f076fc70415f\") " pod="openshift-multus/network-metrics-daemon-4cnzp"
May 06 17:10:44.534700 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:44.534356 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
May 06 17:10:44.534700 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:44.534410 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs podName:e0f197dc-dd5c-4f23-ad0d-f076fc70415f nodeName:}" failed. No retries permitted until 2026-05-06 17:11:00.53439425 +0000 UTC m=+34.175791831 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs") pod "network-metrics-daemon-4cnzp" (UID: "e0f197dc-dd5c-4f23-ad0d-f076fc70415f") : object "openshift-multus"/"metrics-daemon-secret" not registered May 06 17:10:44.735142 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:44.735111 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29mkk\" (UniqueName: \"kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk\") pod \"network-check-target-k855j\" (UID: \"c55e3ee0-9cb2-41dc-9751-75fb3367d774\") " pod="openshift-network-diagnostics/network-check-target-k855j" May 06 17:10:44.735311 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:44.735275 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 06 17:10:44.735311 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:44.735297 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 06 17:10:44.735311 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:44.735307 2578 projected.go:194] Error preparing data for projected volume kube-api-access-29mkk for pod openshift-network-diagnostics/network-check-target-k855j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 06 17:10:44.735418 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:44.735358 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk podName:c55e3ee0-9cb2-41dc-9751-75fb3367d774 nodeName:}" failed. 
No retries permitted until 2026-05-06 17:11:00.735343706 +0000 UTC m=+34.376741287 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-29mkk" (UniqueName: "kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk") pod "network-check-target-k855j" (UID: "c55e3ee0-9cb2-41dc-9751-75fb3367d774") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 06 17:10:44.879850 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:44.879760 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k855j" May 06 17:10:44.880023 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:44.879760 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4cnzp" May 06 17:10:44.880023 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:44.879882 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k855j" podUID="c55e3ee0-9cb2-41dc-9751-75fb3367d774" May 06 17:10:44.880023 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:44.879972 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4cnzp" podUID="e0f197dc-dd5c-4f23-ad0d-f076fc70415f" May 06 17:10:44.880023 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:44.879760 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvb5c" May 06 17:10:44.880201 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:44.880060 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvb5c" podUID="8a7c9e10-c274-46eb-b020-deee09868a53" May 06 17:10:45.138395 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:45.138314 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret\") pod \"global-pull-secret-syncer-wvb5c\" (UID: \"8a7c9e10-c274-46eb-b020-deee09868a53\") " pod="kube-system/global-pull-secret-syncer-wvb5c" May 06 17:10:45.138531 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:45.138438 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered May 06 17:10:45.138531 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:45.138499 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret podName:8a7c9e10-c274-46eb-b020-deee09868a53 nodeName:}" failed. No retries permitted until 2026-05-06 17:11:01.138479181 +0000 UTC m=+34.779876766 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret") pod "global-pull-secret-syncer-wvb5c" (UID: "8a7c9e10-c274-46eb-b020-deee09868a53") : object "kube-system"/"original-pull-secret" not registered May 06 17:10:46.880454 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:46.880419 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k855j" May 06 17:10:46.880991 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:46.880505 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvb5c" May 06 17:10:46.880991 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:46.880545 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4cnzp" May 06 17:10:46.880991 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:46.880554 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k855j" podUID="c55e3ee0-9cb2-41dc-9751-75fb3367d774" May 06 17:10:46.880991 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:46.880629 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4cnzp" podUID="e0f197dc-dd5c-4f23-ad0d-f076fc70415f" May 06 17:10:46.881265 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:46.881219 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvb5c" podUID="8a7c9e10-c274-46eb-b020-deee09868a53" May 06 17:10:47.340512 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:10:47.340478 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd07b8d8f_4948_4e10_8402_db41f4c64242.slice/crio-c116ca7736be8543bbf382c95fcca8c270b280b60fc6ae530dfc9d1d378faabb WatchSource:0}: Error finding container c116ca7736be8543bbf382c95fcca8c270b280b60fc6ae530dfc9d1d378faabb: Status 404 returned error can't find the container with id c116ca7736be8543bbf382c95fcca8c270b280b60fc6ae530dfc9d1d378faabb May 06 17:10:47.946701 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:47.946432 2578 generic.go:358] "Generic (PLEG): container finished" podID="ae7c9848-2f33-464c-a9e2-b97ba5c1b57c" containerID="0aa73421a282a5b74b4ab8f2a339961747506715d95b80739e656520e739d520" exitCode=0 May 06 17:10:47.947493 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:47.946509 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6k67" event={"ID":"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c","Type":"ContainerDied","Data":"0aa73421a282a5b74b4ab8f2a339961747506715d95b80739e656520e739d520"} May 06 17:10:47.950231 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:47.950202 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" 
event={"ID":"67d17f57-c01f-4b15-8664-18245c28cece","Type":"ContainerStarted","Data":"4f09e3319e6e91d3877cea220ecf92a6dde41fbf273d4a821a9699b6d6ac1b98"} May 06 17:10:47.950373 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:47.950240 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" event={"ID":"67d17f57-c01f-4b15-8664-18245c28cece","Type":"ContainerStarted","Data":"99d7f3833981ced80d862da24d51142049ed632ff85a498d818e4df584fc2ae2"} May 06 17:10:47.950373 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:47.950253 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" event={"ID":"67d17f57-c01f-4b15-8664-18245c28cece","Type":"ContainerStarted","Data":"8e861b467646d1ef6ac7b683aeadd120682f5b6f78dfa90431a2f2565b768468"} May 06 17:10:47.950373 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:47.950266 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" event={"ID":"67d17f57-c01f-4b15-8664-18245c28cece","Type":"ContainerStarted","Data":"ca73e957ca794123db6ff1a57461c07699e359c3a019a3408fad079c6c85c826"} May 06 17:10:47.950373 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:47.950278 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" event={"ID":"67d17f57-c01f-4b15-8664-18245c28cece","Type":"ContainerStarted","Data":"eaa84a872000be6d89f46589c04ba5941047e47405f6a04f0bbcc99cc7f21f1e"} May 06 17:10:47.951641 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:47.951598 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-999z4" event={"ID":"3cd5389b-08a4-4cfd-b0c6-38f86b6cdb93","Type":"ContainerStarted","Data":"fe222d77957b07c3245e8394b7125dd82cd563505661330db2a9db911a009fb3"} May 06 17:10:47.953108 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:47.953083 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-f7wdz" event={"ID":"cbdbc6d6-c384-4259-8a3b-f40e37586a30","Type":"ContainerStarted","Data":"558bab90e12ba1904821dce4c2a8c05458d0ad681460a8de0ec90d913b666d0a"} May 06 17:10:47.957540 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:47.957501 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pwg45" event={"ID":"9225285c-2bd9-49db-af85-96b0a3f45d5a","Type":"ContainerStarted","Data":"36a28d297a54d0e2c8e1062c604f4242991e172eb1ba9687c6fb77ace8fcd153"} May 06 17:10:47.958969 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:47.958943 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq" event={"ID":"0a374dc8-39cb-403e-a34c-6d0d7185b09a","Type":"ContainerStarted","Data":"fe2a159cf442baf86e50421837fce1b39fd0034dbe5f6b5215a433dbb1ae5dd9"} May 06 17:10:47.961146 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:47.961106 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mtlxr" event={"ID":"c7ac35c9-4cf5-489f-9fd7-4e950edc1678","Type":"ContainerStarted","Data":"87ddf2d44b5436e50849c40ba542c2f8109e3c27b34eb2e755c5e639ca830f68"} May 06 17:10:47.963085 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:47.963055 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4lbvf" event={"ID":"d07b8d8f-4948-4e10-8402-db41f4c64242","Type":"ContainerStarted","Data":"7fad3ab1553b4d7a70fe0dad5f6ecc69a15180bdaa3443083c257a5f7fb76b2f"} May 06 17:10:47.963164 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:47.963088 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4lbvf" event={"ID":"d07b8d8f-4948-4e10-8402-db41f4c64242","Type":"ContainerStarted","Data":"c116ca7736be8543bbf382c95fcca8c270b280b60fc6ae530dfc9d1d378faabb"} May 06 17:10:47.987773 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:47.987730 2578 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-multus/multus-f7wdz" podStartSLOduration=4.108796308 podStartE2EDuration="21.987713499s" podCreationTimestamp="2026-05-06 17:10:26 +0000 UTC" firstStartedPulling="2026-05-06 17:10:29.523798279 +0000 UTC m=+3.165195869" lastFinishedPulling="2026-05-06 17:10:47.402715472 +0000 UTC m=+21.044113060" observedRunningTime="2026-05-06 17:10:47.98731152 +0000 UTC m=+21.628709118" watchObservedRunningTime="2026-05-06 17:10:47.987713499 +0000 UTC m=+21.629111099" May 06 17:10:48.001397 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:48.001354 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-mtlxr" podStartSLOduration=8.728894734 podStartE2EDuration="22.001337058s" podCreationTimestamp="2026-05-06 17:10:26 +0000 UTC" firstStartedPulling="2026-05-06 17:10:29.513719201 +0000 UTC m=+3.155116792" lastFinishedPulling="2026-05-06 17:10:42.786161528 +0000 UTC m=+16.427559116" observedRunningTime="2026-05-06 17:10:48.0008309 +0000 UTC m=+21.642228521" watchObservedRunningTime="2026-05-06 17:10:48.001337058 +0000 UTC m=+21.642734658" May 06 17:10:48.014800 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:48.014763 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pwg45" podStartSLOduration=3.192075276 podStartE2EDuration="21.014753198s" podCreationTimestamp="2026-05-06 17:10:27 +0000 UTC" firstStartedPulling="2026-05-06 17:10:29.524152149 +0000 UTC m=+3.165549741" lastFinishedPulling="2026-05-06 17:10:47.346830083 +0000 UTC m=+20.988227663" observedRunningTime="2026-05-06 17:10:48.014613557 +0000 UTC m=+21.656011156" watchObservedRunningTime="2026-05-06 17:10:48.014753198 +0000 UTC m=+21.656150790" May 06 17:10:48.028846 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:48.028804 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4lbvf" podStartSLOduration=15.028791536 
podStartE2EDuration="15.028791536s" podCreationTimestamp="2026-05-06 17:10:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-06 17:10:48.028126088 +0000 UTC m=+21.669523687" watchObservedRunningTime="2026-05-06 17:10:48.028791536 +0000 UTC m=+21.670189133" May 06 17:10:48.072010 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:48.071764 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-999z4" podStartSLOduration=4.235127343 podStartE2EDuration="22.071746867s" podCreationTimestamp="2026-05-06 17:10:26 +0000 UTC" firstStartedPulling="2026-05-06 17:10:29.525264162 +0000 UTC m=+3.166661752" lastFinishedPulling="2026-05-06 17:10:47.361883697 +0000 UTC m=+21.003281276" observedRunningTime="2026-05-06 17:10:48.071278544 +0000 UTC m=+21.712676143" watchObservedRunningTime="2026-05-06 17:10:48.071746867 +0000 UTC m=+21.713144466" May 06 17:10:48.745046 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:48.745022 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" May 06 17:10:48.867076 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:48.866898 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-05-06T17:10:48.745042281Z","UUID":"e6dc5d7b-e1a0-4792-bc2b-280ddcae7c71","Handler":null,"Name":"","Endpoint":""} May 06 17:10:48.869646 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:48.869617 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 May 06 17:10:48.869646 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:48.869645 2578 csi_plugin.go:119] kubernetes.io/csi: 
Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock May 06 17:10:48.882761 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:48.882699 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvb5c" May 06 17:10:48.882761 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:48.882720 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k855j" May 06 17:10:48.882986 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:48.882805 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvb5c" podUID="8a7c9e10-c274-46eb-b020-deee09868a53" May 06 17:10:48.882986 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:48.882699 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4cnzp" May 06 17:10:48.882986 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:48.882896 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-k855j" podUID="c55e3ee0-9cb2-41dc-9751-75fb3367d774" May 06 17:10:48.883189 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:48.882990 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4cnzp" podUID="e0f197dc-dd5c-4f23-ad0d-f076fc70415f" May 06 17:10:48.966597 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:48.966547 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq" event={"ID":"0a374dc8-39cb-403e-a34c-6d0d7185b09a","Type":"ContainerStarted","Data":"c9f2a15376a72f5fd83a0ea6f580536cda8694872383987176b25c107771f26c"} May 06 17:10:48.967935 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:48.967910 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-h8tck" event={"ID":"a16518eb-5b61-43e2-b86f-2f0f8bda5d9a","Type":"ContainerStarted","Data":"e4bc85209ccf7984fd75a50901a9ba812f8ba55ca0ddf9aa39a9c4385b7ee2dd"} May 06 17:10:48.970714 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:48.970687 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" event={"ID":"67d17f57-c01f-4b15-8664-18245c28cece","Type":"ContainerStarted","Data":"127ec15022cab7543b96b7a450fae9903c13c66831992ba9b069ee976dfa1ca2"} May 06 17:10:48.982850 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:48.982816 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-h8tck" podStartSLOduration=5.160990707 podStartE2EDuration="22.982804406s" podCreationTimestamp="2026-05-06 17:10:26 +0000 UTC" firstStartedPulling="2026-05-06 
17:10:29.524880189 +0000 UTC m=+3.166277783" lastFinishedPulling="2026-05-06 17:10:47.346693892 +0000 UTC m=+20.988091482" observedRunningTime="2026-05-06 17:10:48.982756152 +0000 UTC m=+22.624153750" watchObservedRunningTime="2026-05-06 17:10:48.982804406 +0000 UTC m=+22.624202007" May 06 17:10:49.974559 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:49.974513 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq" event={"ID":"0a374dc8-39cb-403e-a34c-6d0d7185b09a","Type":"ContainerStarted","Data":"31114a12c824143e772af00acbf7cc2541284e4c2ec90665ead60f2ba09df4cb"} May 06 17:10:49.993668 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:49.993634 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tt8fq" podStartSLOduration=2.693081039 podStartE2EDuration="22.993620665s" podCreationTimestamp="2026-05-06 17:10:27 +0000 UTC" firstStartedPulling="2026-05-06 17:10:29.515782011 +0000 UTC m=+3.157179601" lastFinishedPulling="2026-05-06 17:10:49.816321648 +0000 UTC m=+23.457719227" observedRunningTime="2026-05-06 17:10:49.993577731 +0000 UTC m=+23.634975330" watchObservedRunningTime="2026-05-06 17:10:49.993620665 +0000 UTC m=+23.635018253" May 06 17:10:50.880066 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:50.880030 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k855j" May 06 17:10:50.880245 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:50.880151 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvb5c" May 06 17:10:50.880245 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:50.880160 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k855j" podUID="c55e3ee0-9cb2-41dc-9751-75fb3367d774" May 06 17:10:50.880366 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:50.880265 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvb5c" podUID="8a7c9e10-c274-46eb-b020-deee09868a53" May 06 17:10:50.880366 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:50.880316 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4cnzp" May 06 17:10:50.880468 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:50.880396 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4cnzp" podUID="e0f197dc-dd5c-4f23-ad0d-f076fc70415f" May 06 17:10:50.980021 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:50.979983 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" event={"ID":"67d17f57-c01f-4b15-8664-18245c28cece","Type":"ContainerStarted","Data":"dce0512b0b1653fbe6a1586caaa55fc784996f30c2c72e32e96ccf8df6d096b4"} May 06 17:10:51.403053 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:51.403011 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-mtlxr" May 06 17:10:51.403990 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:51.403967 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-mtlxr" May 06 17:10:51.490600 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:51.490554 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-mtlxr" May 06 17:10:51.491080 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:51.491052 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-mtlxr" May 06 17:10:52.879579 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:52.879414 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4cnzp" May 06 17:10:52.880107 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:52.879415 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k855j" May 06 17:10:52.880107 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:52.879425 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvb5c" May 06 17:10:52.880107 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:52.879704 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4cnzp" podUID="e0f197dc-dd5c-4f23-ad0d-f076fc70415f" May 06 17:10:52.880107 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:52.879769 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvb5c" podUID="8a7c9e10-c274-46eb-b020-deee09868a53" May 06 17:10:52.880107 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:52.879826 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-k855j" podUID="c55e3ee0-9cb2-41dc-9751-75fb3367d774" May 06 17:10:52.984601 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:52.984555 2578 generic.go:358] "Generic (PLEG): container finished" podID="ae7c9848-2f33-464c-a9e2-b97ba5c1b57c" containerID="877d3edc8d7c2feffc13fa573e32db8a7cb4264ae6851b7a6d01543344d8ace8" exitCode=0 May 06 17:10:52.984766 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:52.984639 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6k67" event={"ID":"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c","Type":"ContainerDied","Data":"877d3edc8d7c2feffc13fa573e32db8a7cb4264ae6851b7a6d01543344d8ace8"} May 06 17:10:52.987842 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:52.987820 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" event={"ID":"67d17f57-c01f-4b15-8664-18245c28cece","Type":"ContainerStarted","Data":"b8ed84a09e2df9f494dce141c7116042438c1045238f3dc0b836644327d663ee"} May 06 17:10:53.037381 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:53.037300 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" podStartSLOduration=8.136939415 podStartE2EDuration="26.037286715s" podCreationTimestamp="2026-05-06 17:10:27 +0000 UTC" firstStartedPulling="2026-05-06 17:10:29.512606903 +0000 UTC m=+3.154004496" lastFinishedPulling="2026-05-06 17:10:47.412954219 +0000 UTC m=+21.054351796" observedRunningTime="2026-05-06 17:10:53.035553429 +0000 UTC m=+26.676951025" watchObservedRunningTime="2026-05-06 17:10:53.037286715 +0000 UTC m=+26.678684339" May 06 17:10:53.990962 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:53.990935 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2" May 06 17:10:53.990962 ip-10-0-131-115 kubenswrapper[2578]: I0506 
17:10:53.990972 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:53.991388 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:53.990986 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:54.005914 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:54.005885 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:54.007387 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:54.007367 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:10:54.380944 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:54.380908 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-k855j"]
May 06 17:10:54.381121 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:54.381060 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k855j"
May 06 17:10:54.381195 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:54.381162 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wvb5c"]
May 06 17:10:54.381253 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:54.381188 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k855j" podUID="c55e3ee0-9cb2-41dc-9751-75fb3367d774"
May 06 17:10:54.381318 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:54.381278 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvb5c"
May 06 17:10:54.381422 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:54.381397 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvb5c" podUID="8a7c9e10-c274-46eb-b020-deee09868a53"
May 06 17:10:54.382545 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:54.382513 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4cnzp"]
May 06 17:10:54.382666 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:54.382651 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4cnzp"
May 06 17:10:54.382924 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:54.382855 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4cnzp" podUID="e0f197dc-dd5c-4f23-ad0d-f076fc70415f"
May 06 17:10:54.993958 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:54.993918 2578 generic.go:358] "Generic (PLEG): container finished" podID="ae7c9848-2f33-464c-a9e2-b97ba5c1b57c" containerID="640af7850d85d0eec2d995cd8fb6ec6f56119b0c1a675a8687609fec95de5a12" exitCode=0
May 06 17:10:54.994536 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:54.993996 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6k67" event={"ID":"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c","Type":"ContainerDied","Data":"640af7850d85d0eec2d995cd8fb6ec6f56119b0c1a675a8687609fec95de5a12"}
May 06 17:10:55.879904 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:55.879874 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvb5c"
May 06 17:10:55.880059 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:55.879987 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4cnzp"
May 06 17:10:55.880059 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:55.880004 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvb5c" podUID="8a7c9e10-c274-46eb-b020-deee09868a53"
May 06 17:10:55.880140 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:55.880063 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4cnzp" podUID="e0f197dc-dd5c-4f23-ad0d-f076fc70415f"
May 06 17:10:55.880140 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:55.880092 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k855j"
May 06 17:10:55.880207 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:55.880178 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k855j" podUID="c55e3ee0-9cb2-41dc-9751-75fb3367d774"
May 06 17:10:56.999739 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:56.999699 2578 generic.go:358] "Generic (PLEG): container finished" podID="ae7c9848-2f33-464c-a9e2-b97ba5c1b57c" containerID="95d22847c7955df974b8121db7fa15c89daef6ad94e5bd29c34619dffab36cb2" exitCode=0
May 06 17:10:57.000181 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:56.999764 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6k67" event={"ID":"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c","Type":"ContainerDied","Data":"95d22847c7955df974b8121db7fa15c89daef6ad94e5bd29c34619dffab36cb2"}
May 06 17:10:57.880073 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:57.880026 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k855j"
May 06 17:10:57.880073 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:57.880048 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4cnzp"
May 06 17:10:57.880310 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:57.880145 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvb5c"
May 06 17:10:57.880310 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:57.880145 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k855j" podUID="c55e3ee0-9cb2-41dc-9751-75fb3367d774"
May 06 17:10:57.880310 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:57.880225 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4cnzp" podUID="e0f197dc-dd5c-4f23-ad0d-f076fc70415f"
May 06 17:10:57.880450 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:57.880316 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvb5c" podUID="8a7c9e10-c274-46eb-b020-deee09868a53"
May 06 17:10:59.880355 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:59.880192 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4cnzp"
May 06 17:10:59.880751 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:59.880225 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k855j"
May 06 17:10:59.880751 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:59.880434 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4cnzp" podUID="e0f197dc-dd5c-4f23-ad0d-f076fc70415f"
May 06 17:10:59.880751 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:59.880530 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k855j" podUID="c55e3ee0-9cb2-41dc-9751-75fb3367d774"
May 06 17:10:59.880751 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:10:59.880240 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvb5c"
May 06 17:10:59.880751 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:10:59.880654 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvb5c" podUID="8a7c9e10-c274-46eb-b020-deee09868a53"
May 06 17:11:00.151494 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.151397 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-115.ec2.internal" event="NodeReady"
May 06 17:11:00.151674 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.151553 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
May 06 17:11:00.217542 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.217506 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-68c565b89b-trfwd"]
May 06 17:11:00.221504 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.221482 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.228625 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.228604 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
May 06 17:11:00.228745 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.228726 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
May 06 17:11:00.230911 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.230891 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-7cpjk\""
May 06 17:11:00.231013 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.230894 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
May 06 17:11:00.241828 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.241808 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
May 06 17:11:00.246954 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.246934 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-68c565b89b-trfwd"]
May 06 17:11:00.254396 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.254364 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-b4hs8"]
May 06 17:11:00.257880 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.257861 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-d774v"]
May 06 17:11:00.258031 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.258015 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-b4hs8"
May 06 17:11:00.261422 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.261404 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-d774v"
May 06 17:11:00.263456 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.263439 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
May 06 17:11:00.264052 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.264033 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
May 06 17:11:00.264150 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.264118 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-hfv2w\""
May 06 17:11:00.264209 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.264117 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
May 06 17:11:00.264263 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.264245 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-m5w6t\""
May 06 17:11:00.264559 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.264543 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
May 06 17:11:00.264669 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.264597 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
May 06 17:11:00.281040 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.281022 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-b4hs8"]
May 06 17:11:00.303744 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.303708 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-d774v"]
May 06 17:11:00.366666 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.366637 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-ca-trust-extracted\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.366815 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.366687 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-installation-pull-secrets\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.366815 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.366715 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-bound-sa-token\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.366815 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.366736 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-trusted-ca\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.366815 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.366751 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9b2h\" (UniqueName: \"kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-kube-api-access-l9b2h\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.366815 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.366770 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed-cert\") pod \"ingress-canary-d774v\" (UID: \"40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed\") " pod="openshift-ingress-canary/ingress-canary-d774v"
May 06 17:11:00.366815 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.366797 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-image-registry-private-configuration\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.367045 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.366823 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-tls\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.367045 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.366874 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf278082-faf7-485f-9ac3-52430773540c-metrics-tls\") pod \"dns-default-b4hs8\" (UID: \"bf278082-faf7-485f-9ac3-52430773540c\") " pod="openshift-dns/dns-default-b4hs8"
May 06 17:11:00.367045 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.366933 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bf278082-faf7-485f-9ac3-52430773540c-tmp-dir\") pod \"dns-default-b4hs8\" (UID: \"bf278082-faf7-485f-9ac3-52430773540c\") " pod="openshift-dns/dns-default-b4hs8"
May 06 17:11:00.367045 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.366983 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ds87\" (UniqueName: \"kubernetes.io/projected/40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed-kube-api-access-7ds87\") pod \"ingress-canary-d774v\" (UID: \"40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed\") " pod="openshift-ingress-canary/ingress-canary-d774v"
May 06 17:11:00.367045 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.367016 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-certificates\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.367045 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.367043 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf278082-faf7-485f-9ac3-52430773540c-config-volume\") pod \"dns-default-b4hs8\" (UID: \"bf278082-faf7-485f-9ac3-52430773540c\") " pod="openshift-dns/dns-default-b4hs8"
May 06 17:11:00.367262 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.367075 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r5lc\" (UniqueName: \"kubernetes.io/projected/bf278082-faf7-485f-9ac3-52430773540c-kube-api-access-7r5lc\") pod \"dns-default-b4hs8\" (UID: \"bf278082-faf7-485f-9ac3-52430773540c\") " pod="openshift-dns/dns-default-b4hs8"
May 06 17:11:00.467765 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.467734 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-certificates\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.467927 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.467783 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf278082-faf7-485f-9ac3-52430773540c-config-volume\") pod \"dns-default-b4hs8\" (UID: \"bf278082-faf7-485f-9ac3-52430773540c\") " pod="openshift-dns/dns-default-b4hs8"
May 06 17:11:00.467927 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.467917 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7r5lc\" (UniqueName: \"kubernetes.io/projected/bf278082-faf7-485f-9ac3-52430773540c-kube-api-access-7r5lc\") pod \"dns-default-b4hs8\" (UID: \"bf278082-faf7-485f-9ac3-52430773540c\") " pod="openshift-dns/dns-default-b4hs8"
May 06 17:11:00.468037 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.467987 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-ca-trust-extracted\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.468037 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.468026 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-installation-pull-secrets\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.468117 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.468051 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-bound-sa-token\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.468117 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.468073 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-trusted-ca\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.468117 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.468095 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9b2h\" (UniqueName: \"kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-kube-api-access-l9b2h\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.468285 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.468134 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed-cert\") pod \"ingress-canary-d774v\" (UID: \"40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed\") " pod="openshift-ingress-canary/ingress-canary-d774v"
May 06 17:11:00.468285 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.468162 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-image-registry-private-configuration\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.468285 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.468185 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-tls\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.468285 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.468217 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf278082-faf7-485f-9ac3-52430773540c-metrics-tls\") pod \"dns-default-b4hs8\" (UID: \"bf278082-faf7-485f-9ac3-52430773540c\") " pod="openshift-dns/dns-default-b4hs8"
May 06 17:11:00.468285 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.468239 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bf278082-faf7-485f-9ac3-52430773540c-tmp-dir\") pod \"dns-default-b4hs8\" (UID: \"bf278082-faf7-485f-9ac3-52430773540c\") " pod="openshift-dns/dns-default-b4hs8"
May 06 17:11:00.468285 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.468262 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ds87\" (UniqueName: \"kubernetes.io/projected/40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed-kube-api-access-7ds87\") pod \"ingress-canary-d774v\" (UID: \"40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed\") " pod="openshift-ingress-canary/ingress-canary-d774v"
May 06 17:11:00.468503 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.468332 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf278082-faf7-485f-9ac3-52430773540c-config-volume\") pod \"dns-default-b4hs8\" (UID: \"bf278082-faf7-485f-9ac3-52430773540c\") " pod="openshift-dns/dns-default-b4hs8"
May 06 17:11:00.468503 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.468388 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-certificates\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.468625 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:00.468607 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
May 06 17:11:00.468691 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:00.468628 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68c565b89b-trfwd: secret "image-registry-tls" not found
May 06 17:11:00.468691 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:00.468681 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-tls podName:3954deed-cfd3-49d8-8c8a-f057a70b5e6b nodeName:}" failed. No retries permitted until 2026-05-06 17:11:00.968664249 +0000 UTC m=+34.610061837 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-tls") pod "image-registry-68c565b89b-trfwd" (UID: "3954deed-cfd3-49d8-8c8a-f057a70b5e6b") : secret "image-registry-tls" not found
May 06 17:11:00.469212 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:00.468923 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
May 06 17:11:00.469212 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:00.468969 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf278082-faf7-485f-9ac3-52430773540c-metrics-tls podName:bf278082-faf7-485f-9ac3-52430773540c nodeName:}" failed. No retries permitted until 2026-05-06 17:11:00.968954826 +0000 UTC m=+34.610352402 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf278082-faf7-485f-9ac3-52430773540c-metrics-tls") pod "dns-default-b4hs8" (UID: "bf278082-faf7-485f-9ac3-52430773540c") : secret "dns-default-metrics-tls" not found
May 06 17:11:00.469212 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.469033 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-trusted-ca\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.469212 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:00.469112 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
May 06 17:11:00.469212 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:00.469163 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed-cert podName:40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed nodeName:}" failed. No retries permitted until 2026-05-06 17:11:00.969147725 +0000 UTC m=+34.610545304 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed-cert") pod "ingress-canary-d774v" (UID: "40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed") : secret "canary-serving-cert" not found
May 06 17:11:00.469212 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.469181 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bf278082-faf7-485f-9ac3-52430773540c-tmp-dir\") pod \"dns-default-b4hs8\" (UID: \"bf278082-faf7-485f-9ac3-52430773540c\") " pod="openshift-dns/dns-default-b4hs8"
May 06 17:11:00.469557 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.469345 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-ca-trust-extracted\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.472779 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.472755 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-image-registry-private-configuration\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.472914 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.472812 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-installation-pull-secrets\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.482766 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.482745 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r5lc\" (UniqueName: \"kubernetes.io/projected/bf278082-faf7-485f-9ac3-52430773540c-kube-api-access-7r5lc\") pod \"dns-default-b4hs8\" (UID: \"bf278082-faf7-485f-9ac3-52430773540c\") " pod="openshift-dns/dns-default-b4hs8"
May 06 17:11:00.482861 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.482770 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9b2h\" (UniqueName: \"kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-kube-api-access-l9b2h\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.490535 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.490514 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-bound-sa-token\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.492424 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.492399 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ds87\" (UniqueName: \"kubernetes.io/projected/40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed-kube-api-access-7ds87\") pod \"ingress-canary-d774v\" (UID: \"40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed\") " pod="openshift-ingress-canary/ingress-canary-d774v"
May 06 17:11:00.569070 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.569041 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs\") pod \"network-metrics-daemon-4cnzp\" (UID: \"e0f197dc-dd5c-4f23-ad0d-f076fc70415f\") " pod="openshift-multus/network-metrics-daemon-4cnzp"
May 06 17:11:00.569228 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:00.569194 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
May 06 17:11:00.569299 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:00.569278 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs podName:e0f197dc-dd5c-4f23-ad0d-f076fc70415f nodeName:}" failed. No retries permitted until 2026-05-06 17:11:32.569255808 +0000 UTC m=+66.210653384 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs") pod "network-metrics-daemon-4cnzp" (UID: "e0f197dc-dd5c-4f23-ad0d-f076fc70415f") : object "openshift-multus"/"metrics-daemon-secret" not registered
May 06 17:11:00.770256 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.770172 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29mkk\" (UniqueName: \"kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk\") pod \"network-check-target-k855j\" (UID: \"c55e3ee0-9cb2-41dc-9751-75fb3367d774\") " pod="openshift-network-diagnostics/network-check-target-k855j"
May 06 17:11:00.770417 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:00.770354 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
May 06 17:11:00.770417 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:00.770379 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
May 06 17:11:00.770417 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:00.770392 2578 projected.go:194] Error preparing data for projected volume kube-api-access-29mkk for pod openshift-network-diagnostics/network-check-target-k855j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
May 06 17:11:00.770539 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:00.770455 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk podName:c55e3ee0-9cb2-41dc-9751-75fb3367d774 nodeName:}" failed. No retries permitted until 2026-05-06 17:11:32.770436151 +0000 UTC m=+66.411833741 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-29mkk" (UniqueName: "kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk") pod "network-check-target-k855j" (UID: "c55e3ee0-9cb2-41dc-9751-75fb3367d774") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
May 06 17:11:00.971777 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.971736 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed-cert\") pod \"ingress-canary-d774v\" (UID: \"40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed\") " pod="openshift-ingress-canary/ingress-canary-d774v"
May 06 17:11:00.972206 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.971790 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-tls\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:00.972206 ip-10-0-131-115
kubenswrapper[2578]: E0506 17:11:00.971893 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found May 06 17:11:00.972206 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:00.971899 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found May 06 17:11:00.972206 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:00.971918 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf278082-faf7-485f-9ac3-52430773540c-metrics-tls\") pod \"dns-default-b4hs8\" (UID: \"bf278082-faf7-485f-9ac3-52430773540c\") " pod="openshift-dns/dns-default-b4hs8" May 06 17:11:00.972206 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:00.971906 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68c565b89b-trfwd: secret "image-registry-tls" not found May 06 17:11:00.972206 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:00.971979 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed-cert podName:40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed nodeName:}" failed. No retries permitted until 2026-05-06 17:11:01.971959264 +0000 UTC m=+35.613356854 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed-cert") pod "ingress-canary-d774v" (UID: "40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed") : secret "canary-serving-cert" not found May 06 17:11:00.972206 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:00.972010 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-tls podName:3954deed-cfd3-49d8-8c8a-f057a70b5e6b nodeName:}" failed. 
No retries permitted until 2026-05-06 17:11:01.971992377 +0000 UTC m=+35.613389956 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-tls") pod "image-registry-68c565b89b-trfwd" (UID: "3954deed-cfd3-49d8-8c8a-f057a70b5e6b") : secret "image-registry-tls" not found May 06 17:11:00.972206 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:00.972034 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found May 06 17:11:00.972206 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:00.972082 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf278082-faf7-485f-9ac3-52430773540c-metrics-tls podName:bf278082-faf7-485f-9ac3-52430773540c nodeName:}" failed. No retries permitted until 2026-05-06 17:11:01.972068716 +0000 UTC m=+35.613466312 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf278082-faf7-485f-9ac3-52430773540c-metrics-tls") pod "dns-default-b4hs8" (UID: "bf278082-faf7-485f-9ac3-52430773540c") : secret "dns-default-metrics-tls" not found May 06 17:11:01.035528 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:01.035452 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4lbvf_d07b8d8f-4948-4e10-8402-db41f4c64242/dns-node-resolver/0.log" May 06 17:11:01.173401 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:01.173363 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret\") pod \"global-pull-secret-syncer-wvb5c\" (UID: \"8a7c9e10-c274-46eb-b020-deee09868a53\") " pod="kube-system/global-pull-secret-syncer-wvb5c" May 06 17:11:01.173564 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:01.173509 2578 
secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered May 06 17:11:01.173650 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:01.173600 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret podName:8a7c9e10-c274-46eb-b020-deee09868a53 nodeName:}" failed. No retries permitted until 2026-05-06 17:11:33.173563892 +0000 UTC m=+66.814961467 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret") pod "global-pull-secret-syncer-wvb5c" (UID: "8a7c9e10-c274-46eb-b020-deee09868a53") : object "kube-system"/"original-pull-secret" not registered May 06 17:11:01.879748 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:01.879654 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvb5c" May 06 17:11:01.879748 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:01.879695 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k855j" May 06 17:11:01.879992 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:01.879775 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4cnzp" May 06 17:11:01.884008 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:01.883987 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" May 06 17:11:01.884008 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:01.884000 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" May 06 17:11:01.884157 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:01.884046 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7hgjs\"" May 06 17:11:01.884203 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:01.884179 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" May 06 17:11:01.884254 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:01.884241 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c658w\"" May 06 17:11:01.884300 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:01.884269 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" May 06 17:11:01.983615 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:01.983563 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-tls\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd" May 06 17:11:01.984020 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:01.983631 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/bf278082-faf7-485f-9ac3-52430773540c-metrics-tls\") pod \"dns-default-b4hs8\" (UID: \"bf278082-faf7-485f-9ac3-52430773540c\") " pod="openshift-dns/dns-default-b4hs8" May 06 17:11:01.984020 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:01.983722 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed-cert\") pod \"ingress-canary-d774v\" (UID: \"40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed\") " pod="openshift-ingress-canary/ingress-canary-d774v" May 06 17:11:01.984020 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:01.983734 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found May 06 17:11:01.984020 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:01.983755 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68c565b89b-trfwd: secret "image-registry-tls" not found May 06 17:11:01.984020 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:01.983812 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-tls podName:3954deed-cfd3-49d8-8c8a-f057a70b5e6b nodeName:}" failed. No retries permitted until 2026-05-06 17:11:03.983793364 +0000 UTC m=+37.625190956 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-tls") pod "image-registry-68c565b89b-trfwd" (UID: "3954deed-cfd3-49d8-8c8a-f057a70b5e6b") : secret "image-registry-tls" not found May 06 17:11:01.984020 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:01.983810 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found May 06 17:11:01.984020 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:01.983847 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed-cert podName:40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed nodeName:}" failed. No retries permitted until 2026-05-06 17:11:03.983836751 +0000 UTC m=+37.625234327 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed-cert") pod "ingress-canary-d774v" (UID: "40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed") : secret "canary-serving-cert" not found May 06 17:11:01.984020 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:01.983860 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found May 06 17:11:01.984020 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:01.983913 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf278082-faf7-485f-9ac3-52430773540c-metrics-tls podName:bf278082-faf7-485f-9ac3-52430773540c nodeName:}" failed. No retries permitted until 2026-05-06 17:11:03.983900066 +0000 UTC m=+37.625297641 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf278082-faf7-485f-9ac3-52430773540c-metrics-tls") pod "dns-default-b4hs8" (UID: "bf278082-faf7-485f-9ac3-52430773540c") : secret "dns-default-metrics-tls" not found May 06 17:11:02.207093 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:02.207057 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pwg45_9225285c-2bd9-49db-af85-96b0a3f45d5a/node-ca/0.log" May 06 17:11:02.476062 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:02.475988 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-5f598d4645-z425g"] May 06 17:11:02.506964 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:02.506927 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-5f598d4645-z425g"] May 06 17:11:02.507113 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:02.507073 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-5f598d4645-z425g" May 06 17:11:02.510074 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:02.510023 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" May 06 17:11:02.511077 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:02.511053 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" May 06 17:11:02.511211 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:02.511061 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-hgms8\"" May 06 17:11:02.587691 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:02.587654 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dvsf\" (UniqueName: \"kubernetes.io/projected/a8efde7e-259d-44b2-8529-28c7ccbbff38-kube-api-access-6dvsf\") pod \"migrator-5f598d4645-z425g\" (UID: \"a8efde7e-259d-44b2-8529-28c7ccbbff38\") " pod="openshift-kube-storage-version-migrator/migrator-5f598d4645-z425g" May 06 17:11:02.688340 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:02.688309 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dvsf\" (UniqueName: \"kubernetes.io/projected/a8efde7e-259d-44b2-8529-28c7ccbbff38-kube-api-access-6dvsf\") pod \"migrator-5f598d4645-z425g\" (UID: \"a8efde7e-259d-44b2-8529-28c7ccbbff38\") " pod="openshift-kube-storage-version-migrator/migrator-5f598d4645-z425g" May 06 17:11:02.696577 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:02.696551 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dvsf\" (UniqueName: 
\"kubernetes.io/projected/a8efde7e-259d-44b2-8529-28c7ccbbff38-kube-api-access-6dvsf\") pod \"migrator-5f598d4645-z425g\" (UID: \"a8efde7e-259d-44b2-8529-28c7ccbbff38\") " pod="openshift-kube-storage-version-migrator/migrator-5f598d4645-z425g" May 06 17:11:02.816474 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:02.816400 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-5f598d4645-z425g" May 06 17:11:03.122491 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:03.122353 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-5f598d4645-z425g"] May 06 17:11:03.125839 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:11:03.125816 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8efde7e_259d_44b2_8529_28c7ccbbff38.slice/crio-3bc53d2a830bf7f0fa670e64f8da54fc4ffc2fd6b9d2b40a9d2f6890e1ba4e71 WatchSource:0}: Error finding container 3bc53d2a830bf7f0fa670e64f8da54fc4ffc2fd6b9d2b40a9d2f6890e1ba4e71: Status 404 returned error can't find the container with id 3bc53d2a830bf7f0fa670e64f8da54fc4ffc2fd6b9d2b40a9d2f6890e1ba4e71 May 06 17:11:03.995381 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:03.995193 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed-cert\") pod \"ingress-canary-d774v\" (UID: \"40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed\") " pod="openshift-ingress-canary/ingress-canary-d774v" May 06 17:11:03.995381 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:03.995241 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-tls\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " 
pod="openshift-image-registry/image-registry-68c565b89b-trfwd" May 06 17:11:03.995381 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:03.995273 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf278082-faf7-485f-9ac3-52430773540c-metrics-tls\") pod \"dns-default-b4hs8\" (UID: \"bf278082-faf7-485f-9ac3-52430773540c\") " pod="openshift-dns/dns-default-b4hs8" May 06 17:11:03.995648 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:03.995392 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found May 06 17:11:03.995648 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:03.995444 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf278082-faf7-485f-9ac3-52430773540c-metrics-tls podName:bf278082-faf7-485f-9ac3-52430773540c nodeName:}" failed. No retries permitted until 2026-05-06 17:11:07.995429555 +0000 UTC m=+41.636827130 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf278082-faf7-485f-9ac3-52430773540c-metrics-tls") pod "dns-default-b4hs8" (UID: "bf278082-faf7-485f-9ac3-52430773540c") : secret "dns-default-metrics-tls" not found May 06 17:11:03.995970 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:03.995865 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found May 06 17:11:03.995970 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:03.995887 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68c565b89b-trfwd: secret "image-registry-tls" not found May 06 17:11:03.995970 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:03.995935 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found May 06 17:11:03.996172 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:03.995947 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-tls podName:3954deed-cfd3-49d8-8c8a-f057a70b5e6b nodeName:}" failed. No retries permitted until 2026-05-06 17:11:07.995928467 +0000 UTC m=+41.637326042 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-tls") pod "image-registry-68c565b89b-trfwd" (UID: "3954deed-cfd3-49d8-8c8a-f057a70b5e6b") : secret "image-registry-tls" not found May 06 17:11:03.996172 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:03.996002 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed-cert podName:40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed nodeName:}" failed. No retries permitted until 2026-05-06 17:11:07.995985457 +0000 UTC m=+41.637383037 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed-cert") pod "ingress-canary-d774v" (UID: "40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed") : secret "canary-serving-cert" not found May 06 17:11:04.014522 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:04.014490 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-5f598d4645-z425g" event={"ID":"a8efde7e-259d-44b2-8529-28c7ccbbff38","Type":"ContainerStarted","Data":"3bc53d2a830bf7f0fa670e64f8da54fc4ffc2fd6b9d2b40a9d2f6890e1ba4e71"} May 06 17:11:04.017205 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:04.017178 2578 generic.go:358] "Generic (PLEG): container finished" podID="ae7c9848-2f33-464c-a9e2-b97ba5c1b57c" containerID="6e8bee6546ace5aab6daae88dffcdf74c31a46465cc286f792791f42964d793d" exitCode=0 May 06 17:11:04.017314 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:04.017228 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6k67" event={"ID":"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c","Type":"ContainerDied","Data":"6e8bee6546ace5aab6daae88dffcdf74c31a46465cc286f792791f42964d793d"} May 06 17:11:05.021241 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:05.021175 2578 generic.go:358] "Generic (PLEG): container finished" podID="ae7c9848-2f33-464c-a9e2-b97ba5c1b57c" containerID="e317ee2d18b50a50870f00a1de41695f7390f02cd77a29422b2a8d5c54d3ca6e" exitCode=0 May 06 17:11:05.021941 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:05.021256 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6k67" event={"ID":"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c","Type":"ContainerDied","Data":"e317ee2d18b50a50870f00a1de41695f7390f02cd77a29422b2a8d5c54d3ca6e"} May 06 17:11:05.022706 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:05.022683 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-5f598d4645-z425g" event={"ID":"a8efde7e-259d-44b2-8529-28c7ccbbff38","Type":"ContainerStarted","Data":"a987c17fd06c3039bebfe8141d304bd70897803c12136880be460ac92b500a40"} May 06 17:11:05.022801 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:05.022712 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-5f598d4645-z425g" event={"ID":"a8efde7e-259d-44b2-8529-28c7ccbbff38","Type":"ContainerStarted","Data":"3c8e797c0bd1d749d7a812375a6dc42452228181537a666d33ced2833a9bc560"} May 06 17:11:05.059877 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:05.059833 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-5f598d4645-z425g" podStartSLOduration=1.4292965 podStartE2EDuration="3.059821856s" podCreationTimestamp="2026-05-06 17:11:02 +0000 UTC" firstStartedPulling="2026-05-06 17:11:03.128083085 +0000 UTC m=+36.769480662" lastFinishedPulling="2026-05-06 17:11:04.758608435 +0000 UTC m=+38.400006018" observedRunningTime="2026-05-06 17:11:05.059508375 +0000 UTC m=+38.700905971" watchObservedRunningTime="2026-05-06 17:11:05.059821856 +0000 UTC m=+38.701219448" May 06 17:11:06.026930 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:06.026890 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6k67" event={"ID":"ae7c9848-2f33-464c-a9e2-b97ba5c1b57c","Type":"ContainerStarted","Data":"bef4a03e1cecf6fcb774704b38639cbf0ad92b5c768e46a11ad55bcbc932c1c0"} May 06 17:11:06.053530 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:06.053481 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-z6k67" podStartSLOduration=5.591859481 podStartE2EDuration="39.053465697s" podCreationTimestamp="2026-05-06 17:10:27 +0000 UTC" firstStartedPulling="2026-05-06 17:10:29.51769914 +0000 UTC 
m=+3.159096720" lastFinishedPulling="2026-05-06 17:11:02.97930536 +0000 UTC m=+36.620702936" observedRunningTime="2026-05-06 17:11:06.051728897 +0000 UTC m=+39.693126490" watchObservedRunningTime="2026-05-06 17:11:06.053465697 +0000 UTC m=+39.694863294" May 06 17:11:08.026771 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:08.026739 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed-cert\") pod \"ingress-canary-d774v\" (UID: \"40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed\") " pod="openshift-ingress-canary/ingress-canary-d774v" May 06 17:11:08.026771 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:08.026776 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-tls\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd" May 06 17:11:08.027199 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:08.026800 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf278082-faf7-485f-9ac3-52430773540c-metrics-tls\") pod \"dns-default-b4hs8\" (UID: \"bf278082-faf7-485f-9ac3-52430773540c\") " pod="openshift-dns/dns-default-b4hs8" May 06 17:11:08.027199 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:08.026890 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found May 06 17:11:08.027199 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:08.026914 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68c565b89b-trfwd: secret "image-registry-tls" not found May 06 17:11:08.027199 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:08.026891 
2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
May 06 17:11:08.027199 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:08.026898 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
May 06 17:11:08.027199 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:08.026964 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-tls podName:3954deed-cfd3-49d8-8c8a-f057a70b5e6b nodeName:}" failed. No retries permitted until 2026-05-06 17:11:16.026947232 +0000 UTC m=+49.668344809 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-tls") pod "image-registry-68c565b89b-trfwd" (UID: "3954deed-cfd3-49d8-8c8a-f057a70b5e6b") : secret "image-registry-tls" not found
May 06 17:11:08.027199 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:08.026976 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf278082-faf7-485f-9ac3-52430773540c-metrics-tls podName:bf278082-faf7-485f-9ac3-52430773540c nodeName:}" failed. No retries permitted until 2026-05-06 17:11:16.026970796 +0000 UTC m=+49.668368372 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf278082-faf7-485f-9ac3-52430773540c-metrics-tls") pod "dns-default-b4hs8" (UID: "bf278082-faf7-485f-9ac3-52430773540c") : secret "dns-default-metrics-tls" not found
May 06 17:11:08.027199 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:08.026987 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed-cert podName:40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed nodeName:}" failed. No retries permitted until 2026-05-06 17:11:16.026982548 +0000 UTC m=+49.668380124 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed-cert") pod "ingress-canary-d774v" (UID: "40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed") : secret "canary-serving-cert" not found
May 06 17:11:16.090781 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:16.090743 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed-cert\") pod \"ingress-canary-d774v\" (UID: \"40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed\") " pod="openshift-ingress-canary/ingress-canary-d774v"
May 06 17:11:16.090781 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:16.090786 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-tls\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:16.091338 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:16.090973 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf278082-faf7-485f-9ac3-52430773540c-metrics-tls\") pod \"dns-default-b4hs8\" (UID: \"bf278082-faf7-485f-9ac3-52430773540c\") " pod="openshift-dns/dns-default-b4hs8"
May 06 17:11:16.094237 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:16.094207 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-tls\") pod \"image-registry-68c565b89b-trfwd\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") " pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:16.094237 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:16.094229 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf278082-faf7-485f-9ac3-52430773540c-metrics-tls\") pod \"dns-default-b4hs8\" (UID: \"bf278082-faf7-485f-9ac3-52430773540c\") " pod="openshift-dns/dns-default-b4hs8"
May 06 17:11:16.094400 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:16.094256 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed-cert\") pod \"ingress-canary-d774v\" (UID: \"40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed\") " pod="openshift-ingress-canary/ingress-canary-d774v"
May 06 17:11:16.136394 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:16.136367 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:16.168698 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:16.168669 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-b4hs8"
May 06 17:11:16.173910 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:16.173888 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-d774v"
May 06 17:11:16.288797 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:16.288765 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-68c565b89b-trfwd"]
May 06 17:11:16.294171 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:11:16.294144 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3954deed_cfd3_49d8_8c8a_f057a70b5e6b.slice/crio-aecac6a1da54446e59ca89c6b2e2ea4099d9c91b4c8606e7ad82dc84c1bcbd43 WatchSource:0}: Error finding container aecac6a1da54446e59ca89c6b2e2ea4099d9c91b4c8606e7ad82dc84c1bcbd43: Status 404 returned error can't find the container with id aecac6a1da54446e59ca89c6b2e2ea4099d9c91b4c8606e7ad82dc84c1bcbd43
May 06 17:11:16.326682 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:16.326657 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-b4hs8"]
May 06 17:11:16.329753 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:11:16.329726 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf278082_faf7_485f_9ac3_52430773540c.slice/crio-66f53c1f083ab36c2b044166f1712da47e3428f63925f87a1590dade26c619bd WatchSource:0}: Error finding container 66f53c1f083ab36c2b044166f1712da47e3428f63925f87a1590dade26c619bd: Status 404 returned error can't find the container with id 66f53c1f083ab36c2b044166f1712da47e3428f63925f87a1590dade26c619bd
May 06 17:11:16.339884 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:16.339862 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-d774v"]
May 06 17:11:16.342676 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:11:16.342653 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40f1eb9d_6ab0_426e_aee9_f57d3f7a16ed.slice/crio-3ea9d33fd78bccd6d61deb09283f8bd6e4413ceaf05b96dcb11cc1c575e106ab WatchSource:0}: Error finding container 3ea9d33fd78bccd6d61deb09283f8bd6e4413ceaf05b96dcb11cc1c575e106ab: Status 404 returned error can't find the container with id 3ea9d33fd78bccd6d61deb09283f8bd6e4413ceaf05b96dcb11cc1c575e106ab
May 06 17:11:17.048440 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:17.048398 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-b4hs8" event={"ID":"bf278082-faf7-485f-9ac3-52430773540c","Type":"ContainerStarted","Data":"66f53c1f083ab36c2b044166f1712da47e3428f63925f87a1590dade26c619bd"}
May 06 17:11:17.049686 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:17.049656 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-d774v" event={"ID":"40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed","Type":"ContainerStarted","Data":"3ea9d33fd78bccd6d61deb09283f8bd6e4413ceaf05b96dcb11cc1c575e106ab"}
May 06 17:11:17.050888 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:17.050855 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-68c565b89b-trfwd" event={"ID":"3954deed-cfd3-49d8-8c8a-f057a70b5e6b","Type":"ContainerStarted","Data":"10cb86e598b92204184518019c1a73d960edcedfb3e20032cdd98da47894beb2"}
May 06 17:11:17.050888 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:17.050887 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-68c565b89b-trfwd" event={"ID":"3954deed-cfd3-49d8-8c8a-f057a70b5e6b","Type":"ContainerStarted","Data":"aecac6a1da54446e59ca89c6b2e2ea4099d9c91b4c8606e7ad82dc84c1bcbd43"}
May 06 17:11:17.051145 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:17.051123 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:17.081275 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:17.081222 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-68c565b89b-trfwd" podStartSLOduration=58.081208239 podStartE2EDuration="58.081208239s" podCreationTimestamp="2026-05-06 17:10:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-06 17:11:17.080513689 +0000 UTC m=+50.721911315" watchObservedRunningTime="2026-05-06 17:11:17.081208239 +0000 UTC m=+50.722605837"
May 06 17:11:19.056471 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:19.056428 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-b4hs8" event={"ID":"bf278082-faf7-485f-9ac3-52430773540c","Type":"ContainerStarted","Data":"d2a5d1b05e015bb79ceac0768c2e0eeebd34f46d3513b1ad39261b1fc0e3839e"}
May 06 17:11:19.056471 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:19.056467 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-b4hs8" event={"ID":"bf278082-faf7-485f-9ac3-52430773540c","Type":"ContainerStarted","Data":"f62264b8799ac650dc85c3a391a27bf7802c2cea393129722f7ae5bd1aaf5191"}
May 06 17:11:19.056975 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:19.056614 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-b4hs8"
May 06 17:11:19.057803 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:19.057781 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-d774v" event={"ID":"40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed","Type":"ContainerStarted","Data":"6284f7f70ac55d132c3e156be92bf891e10aee61ef44f0134a98b51b0c385818"}
May 06 17:11:19.074298 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:19.074260 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-b4hs8" podStartSLOduration=16.736525190000002 podStartE2EDuration="19.074250285s" podCreationTimestamp="2026-05-06 17:11:00 +0000 UTC" firstStartedPulling="2026-05-06 17:11:16.331387597 +0000 UTC m=+49.972785173" lastFinishedPulling="2026-05-06 17:11:18.669112687 +0000 UTC m=+52.310510268" observedRunningTime="2026-05-06 17:11:19.073382851 +0000 UTC m=+52.714780452" watchObservedRunningTime="2026-05-06 17:11:19.074250285 +0000 UTC m=+52.715647882"
May 06 17:11:22.047739 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.047682 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-d774v" podStartSLOduration=19.717924904 podStartE2EDuration="22.047665378s" podCreationTimestamp="2026-05-06 17:11:00 +0000 UTC" firstStartedPulling="2026-05-06 17:11:16.344432986 +0000 UTC m=+49.985830562" lastFinishedPulling="2026-05-06 17:11:18.67417346 +0000 UTC m=+52.315571036" observedRunningTime="2026-05-06 17:11:19.08830522 +0000 UTC m=+52.729702818" watchObservedRunningTime="2026-05-06 17:11:22.047665378 +0000 UTC m=+55.689063009"
May 06 17:11:22.048109 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.047930 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-tgn76"]
May 06 17:11:22.052025 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.052009 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tgn76"
May 06 17:11:22.054887 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.054867 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
May 06 17:11:22.055855 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.055840 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-nrl57\""
May 06 17:11:22.056026 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.056014 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
May 06 17:11:22.056219 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.056079 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
May 06 17:11:22.056531 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.056518 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
May 06 17:11:22.070805 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.070780 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tgn76"]
May 06 17:11:22.085504 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.085483 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-68c565b89b-trfwd"]
May 06 17:11:22.137427 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.137400 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ab661ca8-1bc3-42f1-843e-1b450eaffbfc-data-volume\") pod \"insights-runtime-extractor-tgn76\" (UID: \"ab661ca8-1bc3-42f1-843e-1b450eaffbfc\") " pod="openshift-insights/insights-runtime-extractor-tgn76"
May 06 17:11:22.137600 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.137444 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ab661ca8-1bc3-42f1-843e-1b450eaffbfc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tgn76\" (UID: \"ab661ca8-1bc3-42f1-843e-1b450eaffbfc\") " pod="openshift-insights/insights-runtime-extractor-tgn76"
May 06 17:11:22.137600 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.137471 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ab661ca8-1bc3-42f1-843e-1b450eaffbfc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tgn76\" (UID: \"ab661ca8-1bc3-42f1-843e-1b450eaffbfc\") " pod="openshift-insights/insights-runtime-extractor-tgn76"
May 06 17:11:22.137600 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.137529 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ab661ca8-1bc3-42f1-843e-1b450eaffbfc-crio-socket\") pod \"insights-runtime-extractor-tgn76\" (UID: \"ab661ca8-1bc3-42f1-843e-1b450eaffbfc\") " pod="openshift-insights/insights-runtime-extractor-tgn76"
May 06 17:11:22.137600 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.137547 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8c6b\" (UniqueName: \"kubernetes.io/projected/ab661ca8-1bc3-42f1-843e-1b450eaffbfc-kube-api-access-m8c6b\") pod \"insights-runtime-extractor-tgn76\" (UID: \"ab661ca8-1bc3-42f1-843e-1b450eaffbfc\") " pod="openshift-insights/insights-runtime-extractor-tgn76"
May 06 17:11:22.238700 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.238663 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ab661ca8-1bc3-42f1-843e-1b450eaffbfc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tgn76\" (UID: \"ab661ca8-1bc3-42f1-843e-1b450eaffbfc\") " pod="openshift-insights/insights-runtime-extractor-tgn76"
May 06 17:11:22.238700 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.238697 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ab661ca8-1bc3-42f1-843e-1b450eaffbfc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tgn76\" (UID: \"ab661ca8-1bc3-42f1-843e-1b450eaffbfc\") " pod="openshift-insights/insights-runtime-extractor-tgn76"
May 06 17:11:22.238931 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.238742 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ab661ca8-1bc3-42f1-843e-1b450eaffbfc-crio-socket\") pod \"insights-runtime-extractor-tgn76\" (UID: \"ab661ca8-1bc3-42f1-843e-1b450eaffbfc\") " pod="openshift-insights/insights-runtime-extractor-tgn76"
May 06 17:11:22.238931 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.238761 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8c6b\" (UniqueName: \"kubernetes.io/projected/ab661ca8-1bc3-42f1-843e-1b450eaffbfc-kube-api-access-m8c6b\") pod \"insights-runtime-extractor-tgn76\" (UID: \"ab661ca8-1bc3-42f1-843e-1b450eaffbfc\") " pod="openshift-insights/insights-runtime-extractor-tgn76"
May 06 17:11:22.238931 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.238812 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ab661ca8-1bc3-42f1-843e-1b450eaffbfc-data-volume\") pod \"insights-runtime-extractor-tgn76\" (UID: \"ab661ca8-1bc3-42f1-843e-1b450eaffbfc\") " pod="openshift-insights/insights-runtime-extractor-tgn76"
May 06 17:11:22.239075 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.238982 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ab661ca8-1bc3-42f1-843e-1b450eaffbfc-crio-socket\") pod \"insights-runtime-extractor-tgn76\" (UID: \"ab661ca8-1bc3-42f1-843e-1b450eaffbfc\") " pod="openshift-insights/insights-runtime-extractor-tgn76"
May 06 17:11:22.239139 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.239119 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ab661ca8-1bc3-42f1-843e-1b450eaffbfc-data-volume\") pod \"insights-runtime-extractor-tgn76\" (UID: \"ab661ca8-1bc3-42f1-843e-1b450eaffbfc\") " pod="openshift-insights/insights-runtime-extractor-tgn76"
May 06 17:11:22.239239 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.239215 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ab661ca8-1bc3-42f1-843e-1b450eaffbfc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tgn76\" (UID: \"ab661ca8-1bc3-42f1-843e-1b450eaffbfc\") " pod="openshift-insights/insights-runtime-extractor-tgn76"
May 06 17:11:22.241019 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.241000 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ab661ca8-1bc3-42f1-843e-1b450eaffbfc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tgn76\" (UID: \"ab661ca8-1bc3-42f1-843e-1b450eaffbfc\") " pod="openshift-insights/insights-runtime-extractor-tgn76"
May 06 17:11:22.263879 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.263854 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8c6b\" (UniqueName: \"kubernetes.io/projected/ab661ca8-1bc3-42f1-843e-1b450eaffbfc-kube-api-access-m8c6b\") pod \"insights-runtime-extractor-tgn76\" (UID: \"ab661ca8-1bc3-42f1-843e-1b450eaffbfc\") " pod="openshift-insights/insights-runtime-extractor-tgn76"
May 06 17:11:22.360765 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.360680 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tgn76"
May 06 17:11:22.504509 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:11:22.504479 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab661ca8_1bc3_42f1_843e_1b450eaffbfc.slice/crio-436bb6fb401843042c8d38d92b75f8c5f1d77f31c3b890bc6d7f82441e9b2674 WatchSource:0}: Error finding container 436bb6fb401843042c8d38d92b75f8c5f1d77f31c3b890bc6d7f82441e9b2674: Status 404 returned error can't find the container with id 436bb6fb401843042c8d38d92b75f8c5f1d77f31c3b890bc6d7f82441e9b2674
May 06 17:11:22.508799 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:22.508774 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tgn76"]
May 06 17:11:23.071783 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:23.071748 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tgn76" event={"ID":"ab661ca8-1bc3-42f1-843e-1b450eaffbfc","Type":"ContainerStarted","Data":"e19fc504476b261f8aa0486d8ff9581fb42f805b97e9707e390ab13e7b76a38f"}
May 06 17:11:23.071783 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:23.071788 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tgn76" event={"ID":"ab661ca8-1bc3-42f1-843e-1b450eaffbfc","Type":"ContainerStarted","Data":"436bb6fb401843042c8d38d92b75f8c5f1d77f31c3b890bc6d7f82441e9b2674"}
May 06 17:11:24.076463 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:24.076423 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tgn76" event={"ID":"ab661ca8-1bc3-42f1-843e-1b450eaffbfc","Type":"ContainerStarted","Data":"45c48f2bf8f7594f9985ec89da97e711fdea457cd384e8864469215e0ca7d76a"}
May 06 17:11:25.080565 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:25.080479 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tgn76" event={"ID":"ab661ca8-1bc3-42f1-843e-1b450eaffbfc","Type":"ContainerStarted","Data":"d9d6596ccd786587f200ee58bdb1b8a6b32d429f2ab0725d7f333134106b7267"}
May 06 17:11:25.100263 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:25.100034 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-tgn76" podStartSLOduration=0.970166121 podStartE2EDuration="3.100014058s" podCreationTimestamp="2026-05-06 17:11:22 +0000 UTC" firstStartedPulling="2026-05-06 17:11:22.557390896 +0000 UTC m=+56.198788472" lastFinishedPulling="2026-05-06 17:11:24.687238819 +0000 UTC m=+58.328636409" observedRunningTime="2026-05-06 17:11:25.098919388 +0000 UTC m=+58.740316986" watchObservedRunningTime="2026-05-06 17:11:25.100014058 +0000 UTC m=+58.741411671"
May 06 17:11:26.008987 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:26.008959 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lz4g2"
May 06 17:11:29.062731 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:29.062696 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-b4hs8"
May 06 17:11:32.091142 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:32.091106 2578 patch_prober.go:28] interesting pod/image-registry-68c565b89b-trfwd container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
May 06 17:11:32.091606 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:32.091163 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-68c565b89b-trfwd" podUID="3954deed-cfd3-49d8-8c8a-f057a70b5e6b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
May 06 17:11:32.614120 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:32.614084 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs\") pod \"network-metrics-daemon-4cnzp\" (UID: \"e0f197dc-dd5c-4f23-ad0d-f076fc70415f\") " pod="openshift-multus/network-metrics-daemon-4cnzp"
May 06 17:11:32.616741 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:32.616720 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
May 06 17:11:32.627181 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:32.627151 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0f197dc-dd5c-4f23-ad0d-f076fc70415f-metrics-certs\") pod \"network-metrics-daemon-4cnzp\" (UID: \"e0f197dc-dd5c-4f23-ad0d-f076fc70415f\") " pod="openshift-multus/network-metrics-daemon-4cnzp"
May 06 17:11:32.803173 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:32.803144 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7hgjs\""
May 06 17:11:32.810464 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:32.810443 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4cnzp"
May 06 17:11:32.816363 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:32.816338 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29mkk\" (UniqueName: \"kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk\") pod \"network-check-target-k855j\" (UID: \"c55e3ee0-9cb2-41dc-9751-75fb3367d774\") " pod="openshift-network-diagnostics/network-check-target-k855j"
May 06 17:11:32.819105 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:32.819078 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
May 06 17:11:32.830169 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:32.830140 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
May 06 17:11:32.840764 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:32.840730 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29mkk\" (UniqueName: \"kubernetes.io/projected/c55e3ee0-9cb2-41dc-9751-75fb3367d774-kube-api-access-29mkk\") pod \"network-check-target-k855j\" (UID: \"c55e3ee0-9cb2-41dc-9751-75fb3367d774\") " pod="openshift-network-diagnostics/network-check-target-k855j"
May 06 17:11:32.934573 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:32.934540 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4cnzp"]
May 06 17:11:32.937397 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:11:32.937364 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0f197dc_dd5c_4f23_ad0d_f076fc70415f.slice/crio-4f45e2572819441cd55b32ceb3e1ff420463622640d99703ff091416d66dab41 WatchSource:0}: Error finding container 4f45e2572819441cd55b32ceb3e1ff420463622640d99703ff091416d66dab41: Status 404 returned error can't find the container with id 4f45e2572819441cd55b32ceb3e1ff420463622640d99703ff091416d66dab41
May 06 17:11:33.100747 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:33.100710 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4cnzp" event={"ID":"e0f197dc-dd5c-4f23-ad0d-f076fc70415f","Type":"ContainerStarted","Data":"4f45e2572819441cd55b32ceb3e1ff420463622640d99703ff091416d66dab41"}
May 06 17:11:33.108494 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:33.108473 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c658w\""
May 06 17:11:33.116865 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:33.116844 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k855j"
May 06 17:11:33.219233 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:33.219203 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret\") pod \"global-pull-secret-syncer-wvb5c\" (UID: \"8a7c9e10-c274-46eb-b020-deee09868a53\") " pod="kube-system/global-pull-secret-syncer-wvb5c"
May 06 17:11:33.222157 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:33.222135 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
May 06 17:11:33.231559 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:33.231535 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8a7c9e10-c274-46eb-b020-deee09868a53-original-pull-secret\") pod \"global-pull-secret-syncer-wvb5c\" (UID: \"8a7c9e10-c274-46eb-b020-deee09868a53\") " pod="kube-system/global-pull-secret-syncer-wvb5c"
May 06 17:11:33.237890 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:33.237863 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-k855j"]
May 06 17:11:33.240986 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:11:33.240962 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc55e3ee0_9cb2_41dc_9751_75fb3367d774.slice/crio-09c86d6479eef7757786d9c06a14aeaa8012f049686c7986f89c999c9210240a WatchSource:0}: Error finding container 09c86d6479eef7757786d9c06a14aeaa8012f049686c7986f89c999c9210240a: Status 404 returned error can't find the container with id 09c86d6479eef7757786d9c06a14aeaa8012f049686c7986f89c999c9210240a
May 06 17:11:33.392696 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:33.392660 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvb5c"
May 06 17:11:33.536670 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:33.536637 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wvb5c"]
May 06 17:11:34.105091 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:34.105045 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wvb5c" event={"ID":"8a7c9e10-c274-46eb-b020-deee09868a53","Type":"ContainerStarted","Data":"8f5ea659cef09051b7bed39b67d72eb223b16aec70cb8ca2b2d8d0b31a4b85e4"}
May 06 17:11:34.106318 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:34.106275 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-k855j" event={"ID":"c55e3ee0-9cb2-41dc-9751-75fb3367d774","Type":"ContainerStarted","Data":"09c86d6479eef7757786d9c06a14aeaa8012f049686c7986f89c999c9210240a"}
May 06 17:11:35.110786 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:35.110675 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4cnzp" event={"ID":"e0f197dc-dd5c-4f23-ad0d-f076fc70415f","Type":"ContainerStarted","Data":"a47b1bfe76af2260cf1924db11064402cef27f751d87bedd41fdba856625a16c"}
May 06 17:11:35.110786 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:35.110725 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4cnzp" event={"ID":"e0f197dc-dd5c-4f23-ad0d-f076fc70415f","Type":"ContainerStarted","Data":"7b65bc97777b9c4df6db2d3194cff63873dffd8b9950b908e90d09ed9239ae1b"}
May 06 17:11:35.132487 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:35.132423 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4cnzp" podStartSLOduration=66.568462407 podStartE2EDuration="1m8.132401967s" podCreationTimestamp="2026-05-06 17:10:27 +0000 UTC" firstStartedPulling="2026-05-06 17:11:32.939274791 +0000 UTC m=+66.580672367" lastFinishedPulling="2026-05-06 17:11:34.503214347 +0000 UTC m=+68.144611927" observedRunningTime="2026-05-06 17:11:35.131541683 +0000 UTC m=+68.772939281" watchObservedRunningTime="2026-05-06 17:11:35.132401967 +0000 UTC m=+68.773799568"
May 06 17:11:37.118192 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:37.118156 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-k855j" event={"ID":"c55e3ee0-9cb2-41dc-9751-75fb3367d774","Type":"ContainerStarted","Data":"f3e3d46170bb8433a4bd71e999077b098e5fb84021513a20b6caa751bdceddf7"}
May 06 17:11:37.118660 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:37.118284 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-k855j"
May 06 17:11:37.136558 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:37.136510 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-k855j" podStartSLOduration=67.055065145 podStartE2EDuration="1m10.136493603s" podCreationTimestamp="2026-05-06 17:10:27 +0000 UTC" firstStartedPulling="2026-05-06 17:11:33.242946083 +0000 UTC m=+66.884343660" lastFinishedPulling="2026-05-06 17:11:36.324374539 +0000 UTC m=+69.965772118" observedRunningTime="2026-05-06 17:11:37.134806999 +0000 UTC m=+70.776204726" watchObservedRunningTime="2026-05-06 17:11:37.136493603 +0000 UTC m=+70.777891200"
May 06 17:11:39.125257 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:39.125217 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wvb5c" event={"ID":"8a7c9e10-c274-46eb-b020-deee09868a53","Type":"ContainerStarted","Data":"920dee759d55848d9c80cc8e796d0a02cd32bde102aea9b63b53e338b917070e"}
May 06 17:11:39.143968 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:39.143922 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-wvb5c" podStartSLOduration=65.502081942 podStartE2EDuration="1m10.14390648s" podCreationTimestamp="2026-05-06 17:10:29 +0000 UTC" firstStartedPulling="2026-05-06 17:11:33.545376414 +0000 UTC m=+67.186773989" lastFinishedPulling="2026-05-06 17:11:38.187200947 +0000 UTC m=+71.828598527" observedRunningTime="2026-05-06 17:11:39.142757736 +0000 UTC m=+72.784155334" watchObservedRunningTime="2026-05-06 17:11:39.14390648 +0000 UTC m=+72.785304078"
May 06 17:11:42.089781 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:42.089754 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:47.104217 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.104090 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-68c565b89b-trfwd" podUID="3954deed-cfd3-49d8-8c8a-f057a70b5e6b" containerName="registry" containerID="cri-o://10cb86e598b92204184518019c1a73d960edcedfb3e20032cdd98da47894beb2" gracePeriod=30
May 06 17:11:47.340189 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.340169 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68c565b89b-trfwd"
May 06 17:11:47.421884 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.421862 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-installation-pull-secrets\") pod \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") "
May 06 17:11:47.422057 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.421890 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-tls\") pod \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") "
May 06 17:11:47.422057 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.421941 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9b2h\" (UniqueName: \"kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-kube-api-access-l9b2h\") pod \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") "
May 06 17:11:47.422057 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.422045 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-certificates\") pod \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") "
May 06 17:11:47.422224 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.422085 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-ca-trust-extracted\") pod \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") "
May 06 17:11:47.422224 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.422137 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-bound-sa-token\") pod \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") "
May 06 17:11:47.422224 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.422168 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-image-registry-private-configuration\") pod \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") "
May 06 17:11:47.422224 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.422209 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-trusted-ca\") pod \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\" (UID: \"3954deed-cfd3-49d8-8c8a-f057a70b5e6b\") "
May 06 17:11:47.422620 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.422566 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "3954deed-cfd3-49d8-8c8a-f057a70b5e6b" (UID: "3954deed-cfd3-49d8-8c8a-f057a70b5e6b"). InnerVolumeSpecName "registry-certificates".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 06 17:11:47.422863 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.422774 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "3954deed-cfd3-49d8-8c8a-f057a70b5e6b" (UID: "3954deed-cfd3-49d8-8c8a-f057a70b5e6b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 06 17:11:47.424411 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.424363 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "3954deed-cfd3-49d8-8c8a-f057a70b5e6b" (UID: "3954deed-cfd3-49d8-8c8a-f057a70b5e6b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 06 17:11:47.424504 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.424461 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "3954deed-cfd3-49d8-8c8a-f057a70b5e6b" (UID: "3954deed-cfd3-49d8-8c8a-f057a70b5e6b"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 06 17:11:47.424651 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.424625 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-kube-api-access-l9b2h" (OuterVolumeSpecName: "kube-api-access-l9b2h") pod "3954deed-cfd3-49d8-8c8a-f057a70b5e6b" (UID: "3954deed-cfd3-49d8-8c8a-f057a70b5e6b"). InnerVolumeSpecName "kube-api-access-l9b2h". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" May 06 17:11:47.424722 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.424660 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "3954deed-cfd3-49d8-8c8a-f057a70b5e6b" (UID: "3954deed-cfd3-49d8-8c8a-f057a70b5e6b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 06 17:11:47.424865 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.424839 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "3954deed-cfd3-49d8-8c8a-f057a70b5e6b" (UID: "3954deed-cfd3-49d8-8c8a-f057a70b5e6b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 06 17:11:47.430732 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.430708 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "3954deed-cfd3-49d8-8c8a-f057a70b5e6b" (UID: "3954deed-cfd3-49d8-8c8a-f057a70b5e6b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 06 17:11:47.523045 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.523014 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-ca-trust-extracted\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\"" May 06 17:11:47.523045 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.523040 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-bound-sa-token\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\"" May 06 17:11:47.523045 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.523052 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-image-registry-private-configuration\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\"" May 06 17:11:47.523262 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.523062 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-trusted-ca\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\"" May 06 17:11:47.523262 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.523072 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-installation-pull-secrets\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\"" May 06 17:11:47.523262 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.523081 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-tls\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\"" May 06 
17:11:47.523262 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.523089 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l9b2h\" (UniqueName: \"kubernetes.io/projected/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-kube-api-access-l9b2h\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\"" May 06 17:11:47.523262 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.523098 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3954deed-cfd3-49d8-8c8a-f057a70b5e6b-registry-certificates\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\"" May 06 17:11:47.559909 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.559884 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-f89d694d9-lhbr9"] May 06 17:11:47.560109 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.560097 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3954deed-cfd3-49d8-8c8a-f057a70b5e6b" containerName="registry" May 06 17:11:47.560154 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.560111 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3954deed-cfd3-49d8-8c8a-f057a70b5e6b" containerName="registry" May 06 17:11:47.560188 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.560156 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="3954deed-cfd3-49d8-8c8a-f057a70b5e6b" containerName="registry" May 06 17:11:47.564591 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.564565 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f89d694d9-lhbr9" May 06 17:11:47.567816 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.567790 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" May 06 17:11:47.567816 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.567818 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" May 06 17:11:47.568132 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.568114 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" May 06 17:11:47.568231 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.568217 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" May 06 17:11:47.568292 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.568233 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" May 06 17:11:47.568375 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.568310 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-gqzbr\"" May 06 17:11:47.568375 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.568333 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" May 06 17:11:47.568512 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.568500 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" May 06 17:11:47.573716 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.573698 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f89d694d9-lhbr9"] May 06 17:11:47.624075 ip-10-0-131-115 kubenswrapper[2578]: I0506 
17:11:47.624051 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcad3307-adba-45cf-a174-5561ce168948-console-serving-cert\") pod \"console-f89d694d9-lhbr9\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " pod="openshift-console/console-f89d694d9-lhbr9" May 06 17:11:47.624215 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.624081 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dcad3307-adba-45cf-a174-5561ce168948-console-config\") pod \"console-f89d694d9-lhbr9\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " pod="openshift-console/console-f89d694d9-lhbr9" May 06 17:11:47.624215 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.624098 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcad3307-adba-45cf-a174-5561ce168948-service-ca\") pod \"console-f89d694d9-lhbr9\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " pod="openshift-console/console-f89d694d9-lhbr9" May 06 17:11:47.624215 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.624121 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dcad3307-adba-45cf-a174-5561ce168948-oauth-serving-cert\") pod \"console-f89d694d9-lhbr9\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " pod="openshift-console/console-f89d694d9-lhbr9" May 06 17:11:47.624215 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.624172 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jbf5\" (UniqueName: \"kubernetes.io/projected/dcad3307-adba-45cf-a174-5561ce168948-kube-api-access-4jbf5\") pod \"console-f89d694d9-lhbr9\" (UID: 
\"dcad3307-adba-45cf-a174-5561ce168948\") " pod="openshift-console/console-f89d694d9-lhbr9" May 06 17:11:47.624215 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.624188 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dcad3307-adba-45cf-a174-5561ce168948-console-oauth-config\") pod \"console-f89d694d9-lhbr9\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " pod="openshift-console/console-f89d694d9-lhbr9" May 06 17:11:47.724507 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.724428 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcad3307-adba-45cf-a174-5561ce168948-console-serving-cert\") pod \"console-f89d694d9-lhbr9\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " pod="openshift-console/console-f89d694d9-lhbr9" May 06 17:11:47.724507 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.724467 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dcad3307-adba-45cf-a174-5561ce168948-console-config\") pod \"console-f89d694d9-lhbr9\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " pod="openshift-console/console-f89d694d9-lhbr9" May 06 17:11:47.724507 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.724487 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcad3307-adba-45cf-a174-5561ce168948-service-ca\") pod \"console-f89d694d9-lhbr9\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " pod="openshift-console/console-f89d694d9-lhbr9" May 06 17:11:47.724788 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.724640 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/dcad3307-adba-45cf-a174-5561ce168948-oauth-serving-cert\") pod \"console-f89d694d9-lhbr9\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " pod="openshift-console/console-f89d694d9-lhbr9" May 06 17:11:47.724788 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.724697 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jbf5\" (UniqueName: \"kubernetes.io/projected/dcad3307-adba-45cf-a174-5561ce168948-kube-api-access-4jbf5\") pod \"console-f89d694d9-lhbr9\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " pod="openshift-console/console-f89d694d9-lhbr9" May 06 17:11:47.724788 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.724727 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dcad3307-adba-45cf-a174-5561ce168948-console-oauth-config\") pod \"console-f89d694d9-lhbr9\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " pod="openshift-console/console-f89d694d9-lhbr9" May 06 17:11:47.725201 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.725177 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcad3307-adba-45cf-a174-5561ce168948-service-ca\") pod \"console-f89d694d9-lhbr9\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " pod="openshift-console/console-f89d694d9-lhbr9" May 06 17:11:47.725306 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.725207 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dcad3307-adba-45cf-a174-5561ce168948-console-config\") pod \"console-f89d694d9-lhbr9\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " pod="openshift-console/console-f89d694d9-lhbr9" May 06 17:11:47.725368 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.725348 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dcad3307-adba-45cf-a174-5561ce168948-oauth-serving-cert\") pod \"console-f89d694d9-lhbr9\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " pod="openshift-console/console-f89d694d9-lhbr9" May 06 17:11:47.726891 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.726874 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dcad3307-adba-45cf-a174-5561ce168948-console-oauth-config\") pod \"console-f89d694d9-lhbr9\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " pod="openshift-console/console-f89d694d9-lhbr9" May 06 17:11:47.726971 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.726957 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcad3307-adba-45cf-a174-5561ce168948-console-serving-cert\") pod \"console-f89d694d9-lhbr9\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " pod="openshift-console/console-f89d694d9-lhbr9" May 06 17:11:47.733299 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.733282 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jbf5\" (UniqueName: \"kubernetes.io/projected/dcad3307-adba-45cf-a174-5561ce168948-kube-api-access-4jbf5\") pod \"console-f89d694d9-lhbr9\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " pod="openshift-console/console-f89d694d9-lhbr9" May 06 17:11:47.874160 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.874123 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f89d694d9-lhbr9" May 06 17:11:47.994005 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:47.993918 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f89d694d9-lhbr9"] May 06 17:11:47.999761 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:11:47.999734 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcad3307_adba_45cf_a174_5561ce168948.slice/crio-2ba302c3ae88064afdca223ec5a8f5da59aa920c7b4b30955374f9cec169ef5e WatchSource:0}: Error finding container 2ba302c3ae88064afdca223ec5a8f5da59aa920c7b4b30955374f9cec169ef5e: Status 404 returned error can't find the container with id 2ba302c3ae88064afdca223ec5a8f5da59aa920c7b4b30955374f9cec169ef5e May 06 17:11:48.147847 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:48.147810 2578 generic.go:358] "Generic (PLEG): container finished" podID="3954deed-cfd3-49d8-8c8a-f057a70b5e6b" containerID="10cb86e598b92204184518019c1a73d960edcedfb3e20032cdd98da47894beb2" exitCode=0 May 06 17:11:48.148281 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:48.147857 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-68c565b89b-trfwd" event={"ID":"3954deed-cfd3-49d8-8c8a-f057a70b5e6b","Type":"ContainerDied","Data":"10cb86e598b92204184518019c1a73d960edcedfb3e20032cdd98da47894beb2"} May 06 17:11:48.148281 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:48.147897 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-68c565b89b-trfwd" event={"ID":"3954deed-cfd3-49d8-8c8a-f057a70b5e6b","Type":"ContainerDied","Data":"aecac6a1da54446e59ca89c6b2e2ea4099d9c91b4c8606e7ad82dc84c1bcbd43"} May 06 17:11:48.148281 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:48.147899 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-68c565b89b-trfwd" May 06 17:11:48.148281 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:48.147915 2578 scope.go:117] "RemoveContainer" containerID="10cb86e598b92204184518019c1a73d960edcedfb3e20032cdd98da47894beb2" May 06 17:11:48.149251 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:48.149049 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f89d694d9-lhbr9" event={"ID":"dcad3307-adba-45cf-a174-5561ce168948","Type":"ContainerStarted","Data":"2ba302c3ae88064afdca223ec5a8f5da59aa920c7b4b30955374f9cec169ef5e"} May 06 17:11:48.155423 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:48.155405 2578 scope.go:117] "RemoveContainer" containerID="10cb86e598b92204184518019c1a73d960edcedfb3e20032cdd98da47894beb2" May 06 17:11:48.155798 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:11:48.155710 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10cb86e598b92204184518019c1a73d960edcedfb3e20032cdd98da47894beb2\": container with ID starting with 10cb86e598b92204184518019c1a73d960edcedfb3e20032cdd98da47894beb2 not found: ID does not exist" containerID="10cb86e598b92204184518019c1a73d960edcedfb3e20032cdd98da47894beb2" May 06 17:11:48.155798 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:48.155744 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10cb86e598b92204184518019c1a73d960edcedfb3e20032cdd98da47894beb2"} err="failed to get container status \"10cb86e598b92204184518019c1a73d960edcedfb3e20032cdd98da47894beb2\": rpc error: code = NotFound desc = could not find container \"10cb86e598b92204184518019c1a73d960edcedfb3e20032cdd98da47894beb2\": container with ID starting with 10cb86e598b92204184518019c1a73d960edcedfb3e20032cdd98da47894beb2 not found: ID does not exist" May 06 17:11:48.169954 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:48.169931 
2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-68c565b89b-trfwd"] May 06 17:11:48.173690 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:48.173671 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-68c565b89b-trfwd"] May 06 17:11:48.883646 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:48.883612 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3954deed-cfd3-49d8-8c8a-f057a70b5e6b" path="/var/lib/kubelet/pods/3954deed-cfd3-49d8-8c8a-f057a70b5e6b/volumes" May 06 17:11:49.230159 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.230128 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-gmq9q"] May 06 17:11:49.233403 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.233378 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-gmq9q" May 06 17:11:49.236042 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.236017 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" May 06 17:11:49.236153 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.236054 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-h78k4\"" May 06 17:11:49.241878 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.241165 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-gmq9q"] May 06 17:11:49.336700 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.336666 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/b853761b-55f0-41f4-a922-c8f1d5fdfc58-tls-certificates\") pod \"prometheus-operator-admission-webhook-64b84d7657-gmq9q\" (UID: \"b853761b-55f0-41f4-a922-c8f1d5fdfc58\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-gmq9q" May 06 17:11:49.367329 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.367278 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7677d75f5b-t2pqp"] May 06 17:11:49.370487 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.370463 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7677d75f5b-t2pqp" May 06 17:11:49.378444 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.378422 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" May 06 17:11:49.380572 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.380531 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7677d75f5b-t2pqp"] May 06 17:11:49.437642 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.437607 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-console-serving-cert\") pod \"console-7677d75f5b-t2pqp\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") " pod="openshift-console/console-7677d75f5b-t2pqp" May 06 17:11:49.437819 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.437655 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-oauth-serving-cert\") pod \"console-7677d75f5b-t2pqp\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") " pod="openshift-console/console-7677d75f5b-t2pqp" May 06 17:11:49.437819 ip-10-0-131-115 kubenswrapper[2578]: I0506 
17:11:49.437688 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-service-ca\") pod \"console-7677d75f5b-t2pqp\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") " pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:49.437819 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.437718 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-trusted-ca-bundle\") pod \"console-7677d75f5b-t2pqp\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") " pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:49.437819 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.437749 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-console-oauth-config\") pod \"console-7677d75f5b-t2pqp\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") " pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:49.437819 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.437797 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z4jv\" (UniqueName: \"kubernetes.io/projected/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-kube-api-access-8z4jv\") pod \"console-7677d75f5b-t2pqp\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") " pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:49.438030 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.437819 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-console-config\") pod \"console-7677d75f5b-t2pqp\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") " pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:49.438030 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.437864 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b853761b-55f0-41f4-a922-c8f1d5fdfc58-tls-certificates\") pod \"prometheus-operator-admission-webhook-64b84d7657-gmq9q\" (UID: \"b853761b-55f0-41f4-a922-c8f1d5fdfc58\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-gmq9q"
May 06 17:11:49.440565 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.440539 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b853761b-55f0-41f4-a922-c8f1d5fdfc58-tls-certificates\") pod \"prometheus-operator-admission-webhook-64b84d7657-gmq9q\" (UID: \"b853761b-55f0-41f4-a922-c8f1d5fdfc58\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-gmq9q"
May 06 17:11:49.538845 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.538685 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-trusted-ca-bundle\") pod \"console-7677d75f5b-t2pqp\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") " pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:49.538845 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.538736 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-console-oauth-config\") pod \"console-7677d75f5b-t2pqp\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") " pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:49.538845 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.538767 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8z4jv\" (UniqueName: \"kubernetes.io/projected/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-kube-api-access-8z4jv\") pod \"console-7677d75f5b-t2pqp\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") " pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:49.538845 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.538790 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-console-config\") pod \"console-7677d75f5b-t2pqp\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") " pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:49.538845 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.538848 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-console-serving-cert\") pod \"console-7677d75f5b-t2pqp\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") " pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:49.539182 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.538893 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-oauth-serving-cert\") pod \"console-7677d75f5b-t2pqp\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") " pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:49.539182 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.538923 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-service-ca\") pod \"console-7677d75f5b-t2pqp\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") " pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:49.539580 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.539555 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-console-config\") pod \"console-7677d75f5b-t2pqp\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") " pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:49.539717 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.539573 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-service-ca\") pod \"console-7677d75f5b-t2pqp\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") " pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:49.539769 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.539745 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-trusted-ca-bundle\") pod \"console-7677d75f5b-t2pqp\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") " pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:49.540038 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.540018 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-oauth-serving-cert\") pod \"console-7677d75f5b-t2pqp\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") " pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:49.541657 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.541636 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-console-serving-cert\") pod \"console-7677d75f5b-t2pqp\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") " pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:49.541657 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.541647 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-console-oauth-config\") pod \"console-7677d75f5b-t2pqp\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") " pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:49.544370 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.544351 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-gmq9q"
May 06 17:11:49.546931 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.546907 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z4jv\" (UniqueName: \"kubernetes.io/projected/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-kube-api-access-8z4jv\") pod \"console-7677d75f5b-t2pqp\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") " pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:49.682188 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.682154 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:49.686406 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.686248 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-gmq9q"]
May 06 17:11:49.693620 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:11:49.693446 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb853761b_55f0_41f4_a922_c8f1d5fdfc58.slice/crio-645a80668842ebbf12edaf9c0b17b56232e9cedfa968f5d65dbeb384b753b5ab WatchSource:0}: Error finding container 645a80668842ebbf12edaf9c0b17b56232e9cedfa968f5d65dbeb384b753b5ab: Status 404 returned error can't find the container with id 645a80668842ebbf12edaf9c0b17b56232e9cedfa968f5d65dbeb384b753b5ab
May 06 17:11:49.846517 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:49.846378 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7677d75f5b-t2pqp"]
May 06 17:11:49.849175 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:11:49.849145 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f494aed_09f3_4cc4_9eca_7c79c5fd611b.slice/crio-722c05b218f96c49d4eb7e8ebfeacd2f3f1b34d205421442a5129b8c23dc6e6d WatchSource:0}: Error finding container 722c05b218f96c49d4eb7e8ebfeacd2f3f1b34d205421442a5129b8c23dc6e6d: Status 404 returned error can't find the container with id 722c05b218f96c49d4eb7e8ebfeacd2f3f1b34d205421442a5129b8c23dc6e6d
May 06 17:11:50.156698 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:50.156606 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-gmq9q" event={"ID":"b853761b-55f0-41f4-a922-c8f1d5fdfc58","Type":"ContainerStarted","Data":"645a80668842ebbf12edaf9c0b17b56232e9cedfa968f5d65dbeb384b753b5ab"}
May 06 17:11:50.157813 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:50.157787 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7677d75f5b-t2pqp" event={"ID":"2f494aed-09f3-4cc4-9eca-7c79c5fd611b","Type":"ContainerStarted","Data":"722c05b218f96c49d4eb7e8ebfeacd2f3f1b34d205421442a5129b8c23dc6e6d"}
May 06 17:11:52.165154 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:52.165065 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7677d75f5b-t2pqp" event={"ID":"2f494aed-09f3-4cc4-9eca-7c79c5fd611b","Type":"ContainerStarted","Data":"c34f7e52f10f566a7a8a2891216d076345ce7ea405f21c8f6f2efa76eb76355d"}
May 06 17:11:52.166411 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:52.166385 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f89d694d9-lhbr9" event={"ID":"dcad3307-adba-45cf-a174-5561ce168948","Type":"ContainerStarted","Data":"54aa260b3fef6862c5d0f5e88794cab69ee12f14c7c3e9d76bcff443d8229b5e"}
May 06 17:11:52.167541 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:52.167522 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-gmq9q" event={"ID":"b853761b-55f0-41f4-a922-c8f1d5fdfc58","Type":"ContainerStarted","Data":"fdcd233109fab68b9a76fca888c42bf668f30c912612b729cb2a683d37f4ade4"}
May 06 17:11:52.167737 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:52.167720 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-gmq9q"
May 06 17:11:52.172438 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:52.172419 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-gmq9q"
May 06 17:11:52.186438 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:52.186397 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7677d75f5b-t2pqp" podStartSLOduration=1.800948352 podStartE2EDuration="3.186388362s" podCreationTimestamp="2026-05-06 17:11:49 +0000 UTC" firstStartedPulling="2026-05-06 17:11:49.851337559 +0000 UTC m=+83.492735138" lastFinishedPulling="2026-05-06 17:11:51.236777572 +0000 UTC m=+84.878175148" observedRunningTime="2026-05-06 17:11:52.185223925 +0000 UTC m=+85.826621524" watchObservedRunningTime="2026-05-06 17:11:52.186388362 +0000 UTC m=+85.827785954"
May 06 17:11:52.202446 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:52.202405 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-64b84d7657-gmq9q" podStartSLOduration=1.101976544 podStartE2EDuration="3.202393685s" podCreationTimestamp="2026-05-06 17:11:49 +0000 UTC" firstStartedPulling="2026-05-06 17:11:49.696550213 +0000 UTC m=+83.337947808" lastFinishedPulling="2026-05-06 17:11:51.796967373 +0000 UTC m=+85.438364949" observedRunningTime="2026-05-06 17:11:52.201571938 +0000 UTC m=+85.842969536" watchObservedRunningTime="2026-05-06 17:11:52.202393685 +0000 UTC m=+85.843791282"
May 06 17:11:52.224047 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:52.223995 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f89d694d9-lhbr9" podStartSLOduration=1.989484313 podStartE2EDuration="5.223978397s" podCreationTimestamp="2026-05-06 17:11:47 +0000 UTC" firstStartedPulling="2026-05-06 17:11:48.001495165 +0000 UTC m=+81.642892746" lastFinishedPulling="2026-05-06 17:11:51.23598924 +0000 UTC m=+84.877386830" observedRunningTime="2026-05-06 17:11:52.222811733 +0000 UTC m=+85.864209455" watchObservedRunningTime="2026-05-06 17:11:52.223978397 +0000 UTC m=+85.865375996"
May 06 17:11:57.875290 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:57.875251 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-f89d694d9-lhbr9"
May 06 17:11:57.875791 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:57.875425 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f89d694d9-lhbr9"
May 06 17:11:57.880043 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:57.880019 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f89d694d9-lhbr9"
May 06 17:11:58.186863 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.186837 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f89d694d9-lhbr9"
May 06 17:11:58.671900 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.671869 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5cc99f7c99-rzxd8"]
May 06 17:11:58.675029 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.675012 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-rzxd8"
May 06 17:11:58.679579 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.679559 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
May 06 17:11:58.679895 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.679877 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
May 06 17:11:58.680098 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.680085 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
May 06 17:11:58.680479 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.680460 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
May 06 17:11:58.680612 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.680576 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-q2tmr\""
May 06 17:11:58.680687 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.680580 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
May 06 17:11:58.687357 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.687337 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5cc99f7c99-rzxd8"]
May 06 17:11:58.728029 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.728006 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-h7dtw"]
May 06 17:11:58.729927 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.729912 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.733240 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.733220 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
May 06 17:11:58.733402 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.733388 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
May 06 17:11:58.733627 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.733612 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
May 06 17:11:58.733766 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.733750 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7k9wf\""
May 06 17:11:58.803042 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.803013 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd6xl\" (UniqueName: \"kubernetes.io/projected/70a32e35-d08a-40c8-a501-a2edd6df74e6-kube-api-access-qd6xl\") pod \"openshift-state-metrics-5cc99f7c99-rzxd8\" (UID: \"70a32e35-d08a-40c8-a501-a2edd6df74e6\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-rzxd8"
May 06 17:11:58.803181 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.803046 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a629322-a916-4ef9-a2a3-40536587a62d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.803181 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.803072 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4a629322-a916-4ef9-a2a3-40536587a62d-node-exporter-tls\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.803181 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.803132 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/70a32e35-d08a-40c8-a501-a2edd6df74e6-metrics-client-ca\") pod \"openshift-state-metrics-5cc99f7c99-rzxd8\" (UID: \"70a32e35-d08a-40c8-a501-a2edd6df74e6\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-rzxd8"
May 06 17:11:58.803282 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.803207 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95pdp\" (UniqueName: \"kubernetes.io/projected/4a629322-a916-4ef9-a2a3-40536587a62d-kube-api-access-95pdp\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.803282 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.803230 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4a629322-a916-4ef9-a2a3-40536587a62d-node-exporter-wtmp\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.803282 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.803252 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a629322-a916-4ef9-a2a3-40536587a62d-metrics-client-ca\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.803282 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.803271 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4a629322-a916-4ef9-a2a3-40536587a62d-node-exporter-accelerators-collector-config\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.803400 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.803301 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/70a32e35-d08a-40c8-a501-a2edd6df74e6-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5cc99f7c99-rzxd8\" (UID: \"70a32e35-d08a-40c8-a501-a2edd6df74e6\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-rzxd8"
May 06 17:11:58.803400 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.803324 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/70a32e35-d08a-40c8-a501-a2edd6df74e6-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5cc99f7c99-rzxd8\" (UID: \"70a32e35-d08a-40c8-a501-a2edd6df74e6\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-rzxd8"
May 06 17:11:58.803400 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.803340 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4a629322-a916-4ef9-a2a3-40536587a62d-root\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.803400 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.803369 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a629322-a916-4ef9-a2a3-40536587a62d-sys\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.803520 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.803405 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4a629322-a916-4ef9-a2a3-40536587a62d-node-exporter-textfile\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.904069 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.904036 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a629322-a916-4ef9-a2a3-40536587a62d-sys\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.904069 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.904071 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4a629322-a916-4ef9-a2a3-40536587a62d-node-exporter-textfile\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.904566 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.904095 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qd6xl\" (UniqueName: \"kubernetes.io/projected/70a32e35-d08a-40c8-a501-a2edd6df74e6-kube-api-access-qd6xl\") pod \"openshift-state-metrics-5cc99f7c99-rzxd8\" (UID: \"70a32e35-d08a-40c8-a501-a2edd6df74e6\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-rzxd8"
May 06 17:11:58.904566 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.904115 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a629322-a916-4ef9-a2a3-40536587a62d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.904566 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.904132 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4a629322-a916-4ef9-a2a3-40536587a62d-node-exporter-tls\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.904566 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.904156 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/70a32e35-d08a-40c8-a501-a2edd6df74e6-metrics-client-ca\") pod \"openshift-state-metrics-5cc99f7c99-rzxd8\" (UID: \"70a32e35-d08a-40c8-a501-a2edd6df74e6\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-rzxd8"
May 06 17:11:58.904566 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.904152 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a629322-a916-4ef9-a2a3-40536587a62d-sys\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.904566 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.904195 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95pdp\" (UniqueName: \"kubernetes.io/projected/4a629322-a916-4ef9-a2a3-40536587a62d-kube-api-access-95pdp\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.904566 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.904226 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4a629322-a916-4ef9-a2a3-40536587a62d-node-exporter-wtmp\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.904566 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.904256 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a629322-a916-4ef9-a2a3-40536587a62d-metrics-client-ca\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.904566 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.904285 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4a629322-a916-4ef9-a2a3-40536587a62d-node-exporter-accelerators-collector-config\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.904566 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.904317 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/70a32e35-d08a-40c8-a501-a2edd6df74e6-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5cc99f7c99-rzxd8\" (UID: \"70a32e35-d08a-40c8-a501-a2edd6df74e6\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-rzxd8"
May 06 17:11:58.904566 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.904355 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/70a32e35-d08a-40c8-a501-a2edd6df74e6-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5cc99f7c99-rzxd8\" (UID: \"70a32e35-d08a-40c8-a501-a2edd6df74e6\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-rzxd8"
May 06 17:11:58.904566 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.904381 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4a629322-a916-4ef9-a2a3-40536587a62d-root\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.904566 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.904387 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4a629322-a916-4ef9-a2a3-40536587a62d-node-exporter-wtmp\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.904566 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.904427 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4a629322-a916-4ef9-a2a3-40536587a62d-node-exporter-textfile\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.905281 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.904701 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4a629322-a916-4ef9-a2a3-40536587a62d-root\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.905281 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.904994 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4a629322-a916-4ef9-a2a3-40536587a62d-node-exporter-accelerators-collector-config\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.905281 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.905066 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/70a32e35-d08a-40c8-a501-a2edd6df74e6-metrics-client-ca\") pod \"openshift-state-metrics-5cc99f7c99-rzxd8\" (UID: \"70a32e35-d08a-40c8-a501-a2edd6df74e6\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-rzxd8"
May 06 17:11:58.905281 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.905074 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a629322-a916-4ef9-a2a3-40536587a62d-metrics-client-ca\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.906859 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.906836 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a629322-a916-4ef9-a2a3-40536587a62d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.906995 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.906971 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4a629322-a916-4ef9-a2a3-40536587a62d-node-exporter-tls\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.907300 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.907280 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/70a32e35-d08a-40c8-a501-a2edd6df74e6-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5cc99f7c99-rzxd8\" (UID: \"70a32e35-d08a-40c8-a501-a2edd6df74e6\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-rzxd8"
May 06 17:11:58.907720 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.907699 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/70a32e35-d08a-40c8-a501-a2edd6df74e6-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5cc99f7c99-rzxd8\" (UID: \"70a32e35-d08a-40c8-a501-a2edd6df74e6\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-rzxd8"
May 06 17:11:58.912044 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.912017 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95pdp\" (UniqueName: \"kubernetes.io/projected/4a629322-a916-4ef9-a2a3-40536587a62d-kube-api-access-95pdp\") pod \"node-exporter-h7dtw\" (UID: \"4a629322-a916-4ef9-a2a3-40536587a62d\") " pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:58.912202 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.912175 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd6xl\" (UniqueName: \"kubernetes.io/projected/70a32e35-d08a-40c8-a501-a2edd6df74e6-kube-api-access-qd6xl\") pod \"openshift-state-metrics-5cc99f7c99-rzxd8\" (UID: \"70a32e35-d08a-40c8-a501-a2edd6df74e6\") " pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-rzxd8"
May 06 17:11:58.983790 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:58.983725 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-rzxd8"
May 06 17:11:59.038634 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.038605 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-h7dtw"
May 06 17:11:59.049125 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:11:59.049013 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a629322_a916_4ef9_a2a3_40536587a62d.slice/crio-8de86a31bf225490b8805794b9745a957e4e24f49fa21ff7fda140ee265d4cd3 WatchSource:0}: Error finding container 8de86a31bf225490b8805794b9745a957e4e24f49fa21ff7fda140ee265d4cd3: Status 404 returned error can't find the container with id 8de86a31bf225490b8805794b9745a957e4e24f49fa21ff7fda140ee265d4cd3
May 06 17:11:59.113890 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.113867 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5cc99f7c99-rzxd8"]
May 06 17:11:59.115975 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:11:59.115950 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70a32e35_d08a_40c8_a501_a2edd6df74e6.slice/crio-12c7fc6c51ea8a6746dac6586ccd1e298083c75e11f36cfd6e3b7f819a565184 WatchSource:0}: Error finding container 12c7fc6c51ea8a6746dac6586ccd1e298083c75e11f36cfd6e3b7f819a565184: Status 404 returned error can't find the container with id 12c7fc6c51ea8a6746dac6586ccd1e298083c75e11f36cfd6e3b7f819a565184
May 06 17:11:59.187559 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.187530 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h7dtw" event={"ID":"4a629322-a916-4ef9-a2a3-40536587a62d","Type":"ContainerStarted","Data":"8de86a31bf225490b8805794b9745a957e4e24f49fa21ff7fda140ee265d4cd3"}
May 06 17:11:59.188933 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.188911 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-rzxd8" event={"ID":"70a32e35-d08a-40c8-a501-a2edd6df74e6","Type":"ContainerStarted","Data":"68a19de37a443e13a04785d713f0affefa2afb2ae3d36ba09a27811434bccb67"}
May 06 17:11:59.189016 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.188945 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-rzxd8" event={"ID":"70a32e35-d08a-40c8-a501-a2edd6df74e6","Type":"ContainerStarted","Data":"12c7fc6c51ea8a6746dac6586ccd1e298083c75e11f36cfd6e3b7f819a565184"}
May 06 17:11:59.682542 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.682506 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:59.682731 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.682592 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:59.687603 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.687563 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:11:59.758975 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.758941 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
May 06 17:11:59.761844 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.761824 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" May 06 17:11:59.764404 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.764376 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" May 06 17:11:59.764522 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.764446 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" May 06 17:11:59.764522 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.764467 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" May 06 17:11:59.764806 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.764785 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" May 06 17:11:59.764918 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.764820 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" May 06 17:11:59.764918 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.764821 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" May 06 17:11:59.765607 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.765524 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" May 06 17:11:59.765718 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.765634 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-nh55k\"" May 06 17:11:59.765718 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.765655 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" May 06 17:11:59.770561 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.770538 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" May 06 17:11:59.780201 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.780179 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] May 06 17:11:59.913721 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.913688 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7ppr\" (UniqueName: \"kubernetes.io/projected/d26ad6c1-538b-46c9-beb2-48279a7382cd-kube-api-access-w7ppr\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:11:59.914306 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.913738 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:11:59.914306 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.913765 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d26ad6c1-538b-46c9-beb2-48279a7382cd-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:11:59.914306 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.913794 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-config-volume\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:11:59.914306 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.913824 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:11:59.914306 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.913846 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:11:59.914306 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.913944 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d26ad6c1-538b-46c9-beb2-48279a7382cd-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:11:59.914306 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.913984 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " 
pod="openshift-monitoring/alertmanager-main-0" May 06 17:11:59.914306 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.914026 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d26ad6c1-538b-46c9-beb2-48279a7382cd-config-out\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:11:59.914306 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.914068 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d26ad6c1-538b-46c9-beb2-48279a7382cd-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:11:59.914306 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.914103 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-web-config\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:11:59.914306 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.914128 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d26ad6c1-538b-46c9-beb2-48279a7382cd-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:11:59.914306 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:11:59.914147 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.015469 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.015444 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.015608 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.015483 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.015608 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.015523 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d26ad6c1-538b-46c9-beb2-48279a7382cd-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.015608 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.015555 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.015608 ip-10-0-131-115 kubenswrapper[2578]: E0506 
17:12:00.015592 2578 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found May 06 17:12:00.015809 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:12:00.015668 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-secret-alertmanager-main-tls podName:d26ad6c1-538b-46c9-beb2-48279a7382cd nodeName:}" failed. No retries permitted until 2026-05-06 17:12:00.515645804 +0000 UTC m=+94.157043384 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "d26ad6c1-538b-46c9-beb2-48279a7382cd") : secret "alertmanager-main-tls" not found May 06 17:12:00.015809 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.015693 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d26ad6c1-538b-46c9-beb2-48279a7382cd-config-out\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.015809 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.015729 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d26ad6c1-538b-46c9-beb2-48279a7382cd-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.015809 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.015760 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-web-config\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " 
pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.016010 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.015829 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d26ad6c1-538b-46c9-beb2-48279a7382cd-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.016010 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.015861 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.016010 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.015906 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7ppr\" (UniqueName: \"kubernetes.io/projected/d26ad6c1-538b-46c9-beb2-48279a7382cd-kube-api-access-w7ppr\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.016010 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.015939 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.016010 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.015959 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d26ad6c1-538b-46c9-beb2-48279a7382cd-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.016010 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.015992 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-config-volume\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.016933 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.016602 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d26ad6c1-538b-46c9-beb2-48279a7382cd-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.017500 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.017303 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d26ad6c1-538b-46c9-beb2-48279a7382cd-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.017750 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.017729 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d26ad6c1-538b-46c9-beb2-48279a7382cd-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.018724 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.018702 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.018927 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.018904 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.020049 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.020025 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-config-volume\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.020474 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.020424 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.020564 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.020501 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.020755 ip-10-0-131-115 
kubenswrapper[2578]: I0506 17:12:00.020728 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d26ad6c1-538b-46c9-beb2-48279a7382cd-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.021411 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.021378 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-web-config\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.022013 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.021983 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d26ad6c1-538b-46c9-beb2-48279a7382cd-config-out\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.029515 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.029491 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7ppr\" (UniqueName: \"kubernetes.io/projected/d26ad6c1-538b-46c9-beb2-48279a7382cd-kube-api-access-w7ppr\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.196077 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.196039 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-rzxd8" event={"ID":"70a32e35-d08a-40c8-a501-a2edd6df74e6","Type":"ContainerStarted","Data":"53f0ebdcb944c7f0eae2975926cced65a4b9fe6065ced5863dfeb28225af1c74"} May 06 17:12:00.197705 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.197676 2578 generic.go:358] "Generic (PLEG): 
container finished" podID="4a629322-a916-4ef9-a2a3-40536587a62d" containerID="3388d1295827fbfd81d34166241eef9aa929cebfaa8ff43a49b6baddb6137bbb" exitCode=0 May 06 17:12:00.197828 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.197757 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h7dtw" event={"ID":"4a629322-a916-4ef9-a2a3-40536587a62d","Type":"ContainerDied","Data":"3388d1295827fbfd81d34166241eef9aa929cebfaa8ff43a49b6baddb6137bbb"} May 06 17:12:00.202205 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.202185 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7677d75f5b-t2pqp" May 06 17:12:00.295608 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.295514 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f89d694d9-lhbr9"] May 06 17:12:00.521067 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.521040 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.523297 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.523274 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d26ad6c1-538b-46c9-beb2-48279a7382cd-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d26ad6c1-538b-46c9-beb2-48279a7382cd\") " pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.673503 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.673473 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" May 06 17:12:00.836749 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:00.836711 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] May 06 17:12:00.850239 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:12:00.850211 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26ad6c1_538b_46c9_beb2_48279a7382cd.slice/crio-d9e846b1d74ed38d02b06d5dcd8cb2590bce9d7f242ae2bcece9b11d421ee341 WatchSource:0}: Error finding container d9e846b1d74ed38d02b06d5dcd8cb2590bce9d7f242ae2bcece9b11d421ee341: Status 404 returned error can't find the container with id d9e846b1d74ed38d02b06d5dcd8cb2590bce9d7f242ae2bcece9b11d421ee341 May 06 17:12:01.201293 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:01.201257 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d26ad6c1-538b-46c9-beb2-48279a7382cd","Type":"ContainerStarted","Data":"d9e846b1d74ed38d02b06d5dcd8cb2590bce9d7f242ae2bcece9b11d421ee341"} May 06 17:12:01.203038 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:01.203015 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-rzxd8" event={"ID":"70a32e35-d08a-40c8-a501-a2edd6df74e6","Type":"ContainerStarted","Data":"c2ccd92336be9df734a3b2325538613ae7de0effa817e24c3d7bd05807ed8a03"} May 06 17:12:01.204714 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:01.204692 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h7dtw" event={"ID":"4a629322-a916-4ef9-a2a3-40536587a62d","Type":"ContainerStarted","Data":"06e0099f91204b87ce4df6a66fc087d4ca7b61153a7c893bba07f57cac3030c1"} May 06 17:12:01.204714 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:01.204717 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-h7dtw" event={"ID":"4a629322-a916-4ef9-a2a3-40536587a62d","Type":"ContainerStarted","Data":"f29ce253af265069d720fc95d34888f8ca2854262379f891d32e1eef3906d1d4"} May 06 17:12:01.223041 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:01.223002 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5cc99f7c99-rzxd8" podStartSLOduration=2.025976338 podStartE2EDuration="3.222990736s" podCreationTimestamp="2026-05-06 17:11:58 +0000 UTC" firstStartedPulling="2026-05-06 17:11:59.229611607 +0000 UTC m=+92.871009183" lastFinishedPulling="2026-05-06 17:12:00.426625992 +0000 UTC m=+94.068023581" observedRunningTime="2026-05-06 17:12:01.222323433 +0000 UTC m=+94.863721056" watchObservedRunningTime="2026-05-06 17:12:01.222990736 +0000 UTC m=+94.864388355" May 06 17:12:01.247356 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:01.247316 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-h7dtw" podStartSLOduration=2.328241488 podStartE2EDuration="3.247304937s" podCreationTimestamp="2026-05-06 17:11:58 +0000 UTC" firstStartedPulling="2026-05-06 17:11:59.051341711 +0000 UTC m=+92.692739287" lastFinishedPulling="2026-05-06 17:11:59.970405142 +0000 UTC m=+93.611802736" observedRunningTime="2026-05-06 17:12:01.245751221 +0000 UTC m=+94.887148820" watchObservedRunningTime="2026-05-06 17:12:01.247304937 +0000 UTC m=+94.888702588" May 06 17:12:02.212911 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:02.212874 2578 generic.go:358] "Generic (PLEG): container finished" podID="d26ad6c1-538b-46c9-beb2-48279a7382cd" containerID="e00905d6c73a9221400375a366183e2321cc4b14d620f39e5c59df22cc10aeab" exitCode=0 May 06 17:12:02.213369 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:02.212954 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"d26ad6c1-538b-46c9-beb2-48279a7382cd","Type":"ContainerDied","Data":"e00905d6c73a9221400375a366183e2321cc4b14d620f39e5c59df22cc10aeab"} May 06 17:12:03.217014 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.216981 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-667d5c986-57xjw"] May 06 17:12:03.218981 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.218966 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.223316 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.223284 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-hp9md\"" May 06 17:12:03.223434 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.223294 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" May 06 17:12:03.223434 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.223341 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dmo3jhlc3cmhc\"" May 06 17:12:03.223434 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.223370 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" May 06 17:12:03.223434 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.223381 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" May 06 17:12:03.223434 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.223387 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" May 06 17:12:03.229435 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.229410 2578 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-monitoring/metrics-server-667d5c986-57xjw"] May 06 17:12:03.345262 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.345230 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/5ddabe88-c951-48de-9c9d-61a43cde5935-secret-metrics-server-client-certs\") pod \"metrics-server-667d5c986-57xjw\" (UID: \"5ddabe88-c951-48de-9c9d-61a43cde5935\") " pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.345434 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.345286 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ddabe88-c951-48de-9c9d-61a43cde5935-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-667d5c986-57xjw\" (UID: \"5ddabe88-c951-48de-9c9d-61a43cde5935\") " pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.345434 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.345339 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5ddabe88-c951-48de-9c9d-61a43cde5935-audit-log\") pod \"metrics-server-667d5c986-57xjw\" (UID: \"5ddabe88-c951-48de-9c9d-61a43cde5935\") " pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.345553 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.345440 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5ddabe88-c951-48de-9c9d-61a43cde5935-metrics-server-audit-profiles\") pod \"metrics-server-667d5c986-57xjw\" (UID: \"5ddabe88-c951-48de-9c9d-61a43cde5935\") " pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.345553 ip-10-0-131-115 
kubenswrapper[2578]: I0506 17:12:03.345491 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68whb\" (UniqueName: \"kubernetes.io/projected/5ddabe88-c951-48de-9c9d-61a43cde5935-kube-api-access-68whb\") pod \"metrics-server-667d5c986-57xjw\" (UID: \"5ddabe88-c951-48de-9c9d-61a43cde5935\") " pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.345658 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.345549 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ddabe88-c951-48de-9c9d-61a43cde5935-client-ca-bundle\") pod \"metrics-server-667d5c986-57xjw\" (UID: \"5ddabe88-c951-48de-9c9d-61a43cde5935\") " pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.345658 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.345600 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5ddabe88-c951-48de-9c9d-61a43cde5935-secret-metrics-server-tls\") pod \"metrics-server-667d5c986-57xjw\" (UID: \"5ddabe88-c951-48de-9c9d-61a43cde5935\") " pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.446851 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.446814 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5ddabe88-c951-48de-9c9d-61a43cde5935-metrics-server-audit-profiles\") pod \"metrics-server-667d5c986-57xjw\" (UID: \"5ddabe88-c951-48de-9c9d-61a43cde5935\") " pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.446998 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.446870 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68whb\" (UniqueName: 
\"kubernetes.io/projected/5ddabe88-c951-48de-9c9d-61a43cde5935-kube-api-access-68whb\") pod \"metrics-server-667d5c986-57xjw\" (UID: \"5ddabe88-c951-48de-9c9d-61a43cde5935\") " pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.446998 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.446906 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ddabe88-c951-48de-9c9d-61a43cde5935-client-ca-bundle\") pod \"metrics-server-667d5c986-57xjw\" (UID: \"5ddabe88-c951-48de-9c9d-61a43cde5935\") " pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.446998 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.446941 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5ddabe88-c951-48de-9c9d-61a43cde5935-secret-metrics-server-tls\") pod \"metrics-server-667d5c986-57xjw\" (UID: \"5ddabe88-c951-48de-9c9d-61a43cde5935\") " pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.447167 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.446999 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/5ddabe88-c951-48de-9c9d-61a43cde5935-secret-metrics-server-client-certs\") pod \"metrics-server-667d5c986-57xjw\" (UID: \"5ddabe88-c951-48de-9c9d-61a43cde5935\") " pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.447167 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.447040 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ddabe88-c951-48de-9c9d-61a43cde5935-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-667d5c986-57xjw\" (UID: \"5ddabe88-c951-48de-9c9d-61a43cde5935\") " 
pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.447167 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.447066 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5ddabe88-c951-48de-9c9d-61a43cde5935-audit-log\") pod \"metrics-server-667d5c986-57xjw\" (UID: \"5ddabe88-c951-48de-9c9d-61a43cde5935\") " pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.447518 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.447472 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5ddabe88-c951-48de-9c9d-61a43cde5935-audit-log\") pod \"metrics-server-667d5c986-57xjw\" (UID: \"5ddabe88-c951-48de-9c9d-61a43cde5935\") " pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.447890 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.447841 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ddabe88-c951-48de-9c9d-61a43cde5935-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-667d5c986-57xjw\" (UID: \"5ddabe88-c951-48de-9c9d-61a43cde5935\") " pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.447975 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.447925 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5ddabe88-c951-48de-9c9d-61a43cde5935-metrics-server-audit-profiles\") pod \"metrics-server-667d5c986-57xjw\" (UID: \"5ddabe88-c951-48de-9c9d-61a43cde5935\") " pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.449766 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.449748 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/5ddabe88-c951-48de-9c9d-61a43cde5935-secret-metrics-server-tls\") pod \"metrics-server-667d5c986-57xjw\" (UID: \"5ddabe88-c951-48de-9c9d-61a43cde5935\") " pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.449865 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.449763 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ddabe88-c951-48de-9c9d-61a43cde5935-client-ca-bundle\") pod \"metrics-server-667d5c986-57xjw\" (UID: \"5ddabe88-c951-48de-9c9d-61a43cde5935\") " pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.450005 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.449989 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/5ddabe88-c951-48de-9c9d-61a43cde5935-secret-metrics-server-client-certs\") pod \"metrics-server-667d5c986-57xjw\" (UID: \"5ddabe88-c951-48de-9c9d-61a43cde5935\") " pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.454961 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.454941 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68whb\" (UniqueName: \"kubernetes.io/projected/5ddabe88-c951-48de-9c9d-61a43cde5935-kube-api-access-68whb\") pod \"metrics-server-667d5c986-57xjw\" (UID: \"5ddabe88-c951-48de-9c9d-61a43cde5935\") " pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.459724 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.459701 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-655d88fc6c-6fkl6"] May 06 17:12:03.462780 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.462763 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-655d88fc6c-6fkl6" May 06 17:12:03.465493 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.465472 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" May 06 17:12:03.465839 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.465820 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-jstrq\"" May 06 17:12:03.472759 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.472704 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-655d88fc6c-6fkl6"] May 06 17:12:03.528277 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.528244 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:03.548361 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.548330 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a55e3b09-195e-4c75-9243-b827d82be012-monitoring-plugin-cert\") pod \"monitoring-plugin-655d88fc6c-6fkl6\" (UID: \"a55e3b09-195e-4c75-9243-b827d82be012\") " pod="openshift-monitoring/monitoring-plugin-655d88fc6c-6fkl6" May 06 17:12:03.648972 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.648942 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a55e3b09-195e-4c75-9243-b827d82be012-monitoring-plugin-cert\") pod \"monitoring-plugin-655d88fc6c-6fkl6\" (UID: \"a55e3b09-195e-4c75-9243-b827d82be012\") " pod="openshift-monitoring/monitoring-plugin-655d88fc6c-6fkl6" May 06 17:12:03.651269 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.651246 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a55e3b09-195e-4c75-9243-b827d82be012-monitoring-plugin-cert\") pod \"monitoring-plugin-655d88fc6c-6fkl6\" (UID: \"a55e3b09-195e-4c75-9243-b827d82be012\") " pod="openshift-monitoring/monitoring-plugin-655d88fc6c-6fkl6" May 06 17:12:03.694081 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.694055 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7f66bf84b5-vgnd7"] May 06 17:12:03.696838 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.696341 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:03.706657 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.706637 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f66bf84b5-vgnd7"] May 06 17:12:03.772601 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.772530 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-667d5c986-57xjw"] May 06 17:12:03.778687 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:12:03.778663 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ddabe88_c951_48de_9c9d_61a43cde5935.slice/crio-5dec86d44addbe587b5841a9ed25496fc872c9922f7979e028c811e9a5348b4f WatchSource:0}: Error finding container 5dec86d44addbe587b5841a9ed25496fc872c9922f7979e028c811e9a5348b4f: Status 404 returned error can't find the container with id 5dec86d44addbe587b5841a9ed25496fc872c9922f7979e028c811e9a5348b4f May 06 17:12:03.779462 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.779442 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-655d88fc6c-6fkl6" May 06 17:12:03.850249 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.849958 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-trusted-ca-bundle\") pod \"console-7f66bf84b5-vgnd7\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") " pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:03.850249 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.850052 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl5vb\" (UniqueName: \"kubernetes.io/projected/dfdecf74-1eda-4e09-95a9-48f3668705e9-kube-api-access-nl5vb\") pod \"console-7f66bf84b5-vgnd7\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") " pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:03.850249 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.850085 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-oauth-serving-cert\") pod \"console-7f66bf84b5-vgnd7\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") " pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:03.850249 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.850118 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dfdecf74-1eda-4e09-95a9-48f3668705e9-console-serving-cert\") pod \"console-7f66bf84b5-vgnd7\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") " pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:03.850249 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.850145 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-console-config\") pod \"console-7f66bf84b5-vgnd7\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") " pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:03.850249 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.850193 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dfdecf74-1eda-4e09-95a9-48f3668705e9-console-oauth-config\") pod \"console-7f66bf84b5-vgnd7\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") " pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:03.850669 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.850281 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-service-ca\") pod \"console-7f66bf84b5-vgnd7\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") " pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:03.919749 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.919727 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-655d88fc6c-6fkl6"] May 06 17:12:03.922532 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:12:03.922512 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda55e3b09_195e_4c75_9243_b827d82be012.slice/crio-a53687d636ee3f8850fce3aaf72fdc3b9b45b5473bdd47aed9ba7cf71e32e5a8 WatchSource:0}: Error finding container a53687d636ee3f8850fce3aaf72fdc3b9b45b5473bdd47aed9ba7cf71e32e5a8: Status 404 returned error can't find the container with id a53687d636ee3f8850fce3aaf72fdc3b9b45b5473bdd47aed9ba7cf71e32e5a8 May 06 17:12:03.928365 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.928251 2578 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-monitoring/telemeter-client-5b669d585c-7xqfk"] May 06 17:12:03.930498 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.930479 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk" May 06 17:12:03.935892 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.934237 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-dfcck\"" May 06 17:12:03.935892 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.934386 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" May 06 17:12:03.935892 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.935049 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" May 06 17:12:03.940085 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.940065 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" May 06 17:12:03.940320 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.940302 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" May 06 17:12:03.941028 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.940568 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" May 06 17:12:03.944539 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.944516 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" May 06 17:12:03.944683 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.944662 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/telemeter-client-5b669d585c-7xqfk"] May 06 17:12:03.950851 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.950829 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-trusted-ca-bundle\") pod \"console-7f66bf84b5-vgnd7\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") " pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:03.950945 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.950900 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nl5vb\" (UniqueName: \"kubernetes.io/projected/dfdecf74-1eda-4e09-95a9-48f3668705e9-kube-api-access-nl5vb\") pod \"console-7f66bf84b5-vgnd7\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") " pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:03.950945 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.950930 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-oauth-serving-cert\") pod \"console-7f66bf84b5-vgnd7\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") " pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:03.951050 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.950962 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dfdecf74-1eda-4e09-95a9-48f3668705e9-console-serving-cert\") pod \"console-7f66bf84b5-vgnd7\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") " pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:03.951050 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.950990 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-console-config\") pod \"console-7f66bf84b5-vgnd7\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") " pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:03.951050 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.951038 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dfdecf74-1eda-4e09-95a9-48f3668705e9-console-oauth-config\") pod \"console-7f66bf84b5-vgnd7\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") " pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:03.951190 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.951068 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-service-ca\") pod \"console-7f66bf84b5-vgnd7\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") " pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:03.951777 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.951757 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-service-ca\") pod \"console-7f66bf84b5-vgnd7\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") " pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:03.951900 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.951800 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-oauth-serving-cert\") pod \"console-7f66bf84b5-vgnd7\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") " pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:03.952488 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.952468 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-trusted-ca-bundle\") pod \"console-7f66bf84b5-vgnd7\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") " pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:03.952727 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.952706 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-console-config\") pod \"console-7f66bf84b5-vgnd7\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") " pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:03.954904 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.954882 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dfdecf74-1eda-4e09-95a9-48f3668705e9-console-serving-cert\") pod \"console-7f66bf84b5-vgnd7\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") " pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:03.955476 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.955454 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dfdecf74-1eda-4e09-95a9-48f3668705e9-console-oauth-config\") pod \"console-7f66bf84b5-vgnd7\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") " pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:03.961398 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:03.961369 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl5vb\" (UniqueName: \"kubernetes.io/projected/dfdecf74-1eda-4e09-95a9-48f3668705e9-kube-api-access-nl5vb\") pod \"console-7f66bf84b5-vgnd7\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") " pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:04.009972 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.009948 2578 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:04.051485 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.051451 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/0914baa9-2745-4b38-856b-bcc60beae5d4-secret-telemeter-client\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk" May 06 17:12:04.051667 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.051494 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pshc\" (UniqueName: \"kubernetes.io/projected/0914baa9-2745-4b38-856b-bcc60beae5d4-kube-api-access-8pshc\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk" May 06 17:12:04.051667 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.051530 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0914baa9-2745-4b38-856b-bcc60beae5d4-serving-certs-ca-bundle\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk" May 06 17:12:04.051667 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.051561 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0914baa9-2745-4b38-856b-bcc60beae5d4-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk" May 06 
17:12:04.051667 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.051622 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/0914baa9-2745-4b38-856b-bcc60beae5d4-federate-client-tls\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk" May 06 17:12:04.051667 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.051643 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0914baa9-2745-4b38-856b-bcc60beae5d4-metrics-client-ca\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk" May 06 17:12:04.051916 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.051722 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0914baa9-2745-4b38-856b-bcc60beae5d4-telemeter-client-tls\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk" May 06 17:12:04.051916 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.051751 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0914baa9-2745-4b38-856b-bcc60beae5d4-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk" May 06 17:12:04.135068 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.135045 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-7f66bf84b5-vgnd7"] May 06 17:12:04.136729 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:12:04.136701 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfdecf74_1eda_4e09_95a9_48f3668705e9.slice/crio-5320ebda39cad45786497cf6d43404d65769e298939e158318031ca023d3a81e WatchSource:0}: Error finding container 5320ebda39cad45786497cf6d43404d65769e298939e158318031ca023d3a81e: Status 404 returned error can't find the container with id 5320ebda39cad45786497cf6d43404d65769e298939e158318031ca023d3a81e May 06 17:12:04.153102 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.153081 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0914baa9-2745-4b38-856b-bcc60beae5d4-serving-certs-ca-bundle\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk" May 06 17:12:04.153199 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.153111 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0914baa9-2745-4b38-856b-bcc60beae5d4-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk" May 06 17:12:04.153199 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.153149 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/0914baa9-2745-4b38-856b-bcc60beae5d4-federate-client-tls\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk" May 06 
17:12:04.153199 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.153174 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0914baa9-2745-4b38-856b-bcc60beae5d4-metrics-client-ca\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk"
May 06 17:12:04.153362 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.153239 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0914baa9-2745-4b38-856b-bcc60beae5d4-telemeter-client-tls\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk"
May 06 17:12:04.153362 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.153299 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0914baa9-2745-4b38-856b-bcc60beae5d4-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk"
May 06 17:12:04.153450 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.153365 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/0914baa9-2745-4b38-856b-bcc60beae5d4-secret-telemeter-client\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk"
May 06 17:12:04.153450 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.153392 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8pshc\" (UniqueName: \"kubernetes.io/projected/0914baa9-2745-4b38-856b-bcc60beae5d4-kube-api-access-8pshc\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk"
May 06 17:12:04.153876 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.153830 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0914baa9-2745-4b38-856b-bcc60beae5d4-serving-certs-ca-bundle\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk"
May 06 17:12:04.154632 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.153930 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0914baa9-2745-4b38-856b-bcc60beae5d4-metrics-client-ca\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk"
May 06 17:12:04.154632 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.154622 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0914baa9-2745-4b38-856b-bcc60beae5d4-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk"
May 06 17:12:04.156037 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.156012 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0914baa9-2745-4b38-856b-bcc60beae5d4-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk"
May 06 17:12:04.156131 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.156071 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/0914baa9-2745-4b38-856b-bcc60beae5d4-secret-telemeter-client\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk"
May 06 17:12:04.156229 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.156197 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/0914baa9-2745-4b38-856b-bcc60beae5d4-federate-client-tls\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk"
May 06 17:12:04.156553 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.156534 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0914baa9-2745-4b38-856b-bcc60beae5d4-telemeter-client-tls\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk"
May 06 17:12:04.162912 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.162892 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pshc\" (UniqueName: \"kubernetes.io/projected/0914baa9-2745-4b38-856b-bcc60beae5d4-kube-api-access-8pshc\") pod \"telemeter-client-5b669d585c-7xqfk\" (UID: \"0914baa9-2745-4b38-856b-bcc60beae5d4\") " pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk"
May 06 17:12:04.221577 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.221542 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d26ad6c1-538b-46c9-beb2-48279a7382cd","Type":"ContainerStarted","Data":"5ba73b8f19801529eceb1f2641d1e9c0d7772b8e33f57cd4bd1862e5f94e27b1"}
May 06 17:12:04.222749 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.221616 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d26ad6c1-538b-46c9-beb2-48279a7382cd","Type":"ContainerStarted","Data":"9884bc1a70287c21885534c481672a05afbcd356b669076195c2c07c5a9fe3d0"}
May 06 17:12:04.222749 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.221636 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d26ad6c1-538b-46c9-beb2-48279a7382cd","Type":"ContainerStarted","Data":"9e13ed8a03333545ccf61d405749e2ab38d60abea378b47fc223d91153d95062"}
May 06 17:12:04.222749 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.221648 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d26ad6c1-538b-46c9-beb2-48279a7382cd","Type":"ContainerStarted","Data":"415eff7a432be22b5d7990a45b73a3afd69382e79e9be8b888eb0a9e1671b068"}
May 06 17:12:04.222749 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.221661 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d26ad6c1-538b-46c9-beb2-48279a7382cd","Type":"ContainerStarted","Data":"35056857454ab1278d0fb82a6926d06d66b8dd1258527d9dc186e07e54585f08"}
May 06 17:12:04.222749 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.222675 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-655d88fc6c-6fkl6" event={"ID":"a55e3b09-195e-4c75-9243-b827d82be012","Type":"ContainerStarted","Data":"a53687d636ee3f8850fce3aaf72fdc3b9b45b5473bdd47aed9ba7cf71e32e5a8"}
May 06 17:12:04.223770 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.223744 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-667d5c986-57xjw" event={"ID":"5ddabe88-c951-48de-9c9d-61a43cde5935","Type":"ContainerStarted","Data":"5dec86d44addbe587b5841a9ed25496fc872c9922f7979e028c811e9a5348b4f"}
May 06 17:12:04.224996 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.224977 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f66bf84b5-vgnd7" event={"ID":"dfdecf74-1eda-4e09-95a9-48f3668705e9","Type":"ContainerStarted","Data":"3da8fa7e38b8f1085b900a1fd47590d5d284eb91626267867a047bb1f296f3ac"}
May 06 17:12:04.225078 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.225003 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f66bf84b5-vgnd7" event={"ID":"dfdecf74-1eda-4e09-95a9-48f3668705e9","Type":"ContainerStarted","Data":"5320ebda39cad45786497cf6d43404d65769e298939e158318031ca023d3a81e"}
May 06 17:12:04.246184 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.246129 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk"
May 06 17:12:04.257535 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.257498 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7f66bf84b5-vgnd7" podStartSLOduration=1.257485464 podStartE2EDuration="1.257485464s" podCreationTimestamp="2026-05-06 17:12:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-06 17:12:04.254701747 +0000 UTC m=+97.896099345" watchObservedRunningTime="2026-05-06 17:12:04.257485464 +0000 UTC m=+97.898883073"
May 06 17:12:04.377005 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:04.376977 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5b669d585c-7xqfk"]
May 06 17:12:04.379635 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:12:04.379611 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0914baa9_2745_4b38_856b_bcc60beae5d4.slice/crio-6cb911bf5d73693eed43708056f9f83c8b449cb57dbfe66751d354e666c803da WatchSource:0}: Error finding container 6cb911bf5d73693eed43708056f9f83c8b449cb57dbfe66751d354e666c803da: Status 404 returned error can't find the container with id 6cb911bf5d73693eed43708056f9f83c8b449cb57dbfe66751d354e666c803da
May 06 17:12:05.090388 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.090349 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
May 06 17:12:05.093868 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.093845 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.100480 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.100416 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
May 06 17:12:05.100480 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.100465 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
May 06 17:12:05.101093 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.101071 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
May 06 17:12:05.101414 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.101395 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
May 06 17:12:05.101973 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.101858 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
May 06 17:12:05.101973 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.101865 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-jl76z\""
May 06 17:12:05.102247 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.102231 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
May 06 17:12:05.102329 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.102267 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
May 06 17:12:05.102329 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.102231 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
May 06 17:12:05.102930 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.102911 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
May 06 17:12:05.104790 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.104762 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
May 06 17:12:05.104886 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.104851 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-ouaf4jb8r3lt\""
May 06 17:12:05.108345 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.108316 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
May 06 17:12:05.112916 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.112896 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
May 06 17:12:05.120049 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.120025 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
May 06 17:12:05.230221 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.230174 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk" event={"ID":"0914baa9-2745-4b38-856b-bcc60beae5d4","Type":"ContainerStarted","Data":"6cb911bf5d73693eed43708056f9f83c8b449cb57dbfe66751d354e666c803da"}
May 06 17:12:05.264097 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.263990 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/623dd84d-9fb6-4d35-bd9d-8202b9017be2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.264256 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.264140 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/623dd84d-9fb6-4d35-bd9d-8202b9017be2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.264256 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.264179 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-config\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.264256 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.264201 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/623dd84d-9fb6-4d35-bd9d-8202b9017be2-config-out\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.264256 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.264224 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.264256 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.264241 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/623dd84d-9fb6-4d35-bd9d-8202b9017be2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.264504 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.264266 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.264504 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.264335 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/623dd84d-9fb6-4d35-bd9d-8202b9017be2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.264504 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.264374 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/623dd84d-9fb6-4d35-bd9d-8202b9017be2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.264504 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.264411 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.264504 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.264463 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/623dd84d-9fb6-4d35-bd9d-8202b9017be2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.264789 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.264579 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.264789 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.264635 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-web-config\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.264789 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.264701 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.264789 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.264731 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l57ln\" (UniqueName: \"kubernetes.io/projected/623dd84d-9fb6-4d35-bd9d-8202b9017be2-kube-api-access-l57ln\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.264964 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.264794 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.264964 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.264825 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/623dd84d-9fb6-4d35-bd9d-8202b9017be2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.264964 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.264853 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.366303 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.366210 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.366303 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.366260 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/623dd84d-9fb6-4d35-bd9d-8202b9017be2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.366303 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.366299 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.366556 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.366328 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-web-config\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.366556 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.366399 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.366556 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.366424 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l57ln\" (UniqueName: \"kubernetes.io/projected/623dd84d-9fb6-4d35-bd9d-8202b9017be2-kube-api-access-l57ln\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.366556 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.366528 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.366787 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.366570 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/623dd84d-9fb6-4d35-bd9d-8202b9017be2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.366787 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.366623 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.366787 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.366664 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/623dd84d-9fb6-4d35-bd9d-8202b9017be2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.366787 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.366732 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/623dd84d-9fb6-4d35-bd9d-8202b9017be2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.366787 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.366776 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-config\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.367012 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.366801 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/623dd84d-9fb6-4d35-bd9d-8202b9017be2-config-out\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.367012 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.366827 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.367411 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.367333 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/623dd84d-9fb6-4d35-bd9d-8202b9017be2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.367615 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.367578 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/623dd84d-9fb6-4d35-bd9d-8202b9017be2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.370260 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.370236 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/623dd84d-9fb6-4d35-bd9d-8202b9017be2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.370540 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.370519 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/623dd84d-9fb6-4d35-bd9d-8202b9017be2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.371308 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.371262 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/623dd84d-9fb6-4d35-bd9d-8202b9017be2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.374232 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.371450 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.374232 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.371519 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/623dd84d-9fb6-4d35-bd9d-8202b9017be2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.374232 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.371544 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/623dd84d-9fb6-4d35-bd9d-8202b9017be2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.374232 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.371699 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.374232 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.372300 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.374232 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.373443 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/623dd84d-9fb6-4d35-bd9d-8202b9017be2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.374232 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.374120 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/623dd84d-9fb6-4d35-bd9d-8202b9017be2-config-out\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.374641 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.374548 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-web-config\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.375520 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.375498 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/623dd84d-9fb6-4d35-bd9d-8202b9017be2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.376324 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.376299 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.376416 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.376380 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.376833 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.376814 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-config\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.376911 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.376896 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.377247 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.377202 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.377335 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.377315 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/623dd84d-9fb6-4d35-bd9d-8202b9017be2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.378849 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.378828 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/623dd84d-9fb6-4d35-bd9d-8202b9017be2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.383274 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.383226 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l57ln\" (UniqueName: \"kubernetes.io/projected/623dd84d-9fb6-4d35-bd9d-8202b9017be2-kube-api-access-l57ln\") pod \"prometheus-k8s-0\" (UID: \"623dd84d-9fb6-4d35-bd9d-8202b9017be2\") " pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:05.407178 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:05.407150 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:12:06.268500 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:06.268234 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
May 06 17:12:06.272528 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:12:06.272492 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod623dd84d_9fb6_4d35_bd9d_8202b9017be2.slice/crio-06512ca8b021d7c40ed5ade816bf499b6d1e14fdf7f17cbc2a0a37b6f0306a94 WatchSource:0}: Error finding container 06512ca8b021d7c40ed5ade816bf499b6d1e14fdf7f17cbc2a0a37b6f0306a94: Status 404 returned error can't find the container with id 06512ca8b021d7c40ed5ade816bf499b6d1e14fdf7f17cbc2a0a37b6f0306a94
May 06 17:12:07.241801 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:07.241758 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-667d5c986-57xjw" event={"ID":"5ddabe88-c951-48de-9c9d-61a43cde5935","Type":"ContainerStarted","Data":"99e8d5c52eba6fb3548d18b6e2788b2f03df627c3e27c47d5b97de093a63e8be"}
May 06 17:12:07.245671 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:07.245644 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-655d88fc6c-6fkl6" event={"ID":"a55e3b09-195e-4c75-9243-b827d82be012","Type":"ContainerStarted","Data":"239286343edc08c5c433002a2938b65b961f00097494821f447f2e1db9de074f"}
May 06 17:12:07.245860 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:07.245839 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-655d88fc6c-6fkl6"
May 06 17:12:07.248162 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:07.248134 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk" event={"ID":"0914baa9-2745-4b38-856b-bcc60beae5d4","Type":"ContainerStarted","Data":"41a0f8c25f25607a9d00c5fe3ad4c25a86ad0af6d5c6a10e17faff6442a70d55"}
May 06 17:12:07.248266 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:07.248165 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk" event={"ID":"0914baa9-2745-4b38-856b-bcc60beae5d4","Type":"ContainerStarted","Data":"936554d865f874bd0f57d83a3cfe41ec8558b8ffe2db268b5548d6ec3d45c0d0"}
May 06 17:12:07.248266 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:07.248178 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk" event={"ID":"0914baa9-2745-4b38-856b-bcc60beae5d4","Type":"ContainerStarted","Data":"b942fbe53786b096aed3862b7d17f425461e5b1de798b6740298358d1c1e4c8d"}
May 06 17:12:07.249445 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:07.249421 2578 generic.go:358] "Generic (PLEG): container finished" podID="623dd84d-9fb6-4d35-bd9d-8202b9017be2" containerID="e337e8f7e0f4b4e065a76059110911600fed93450a2593bcbf172b9a55e93d08" exitCode=0
May 06 17:12:07.249526 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:07.249453 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"623dd84d-9fb6-4d35-bd9d-8202b9017be2","Type":"ContainerDied","Data":"e337e8f7e0f4b4e065a76059110911600fed93450a2593bcbf172b9a55e93d08"}
May 06 17:12:07.249526 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:07.249476 2578 kubelet.go:2569] "SyncLoop (PLEG): event
for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"623dd84d-9fb6-4d35-bd9d-8202b9017be2","Type":"ContainerStarted","Data":"06512ca8b021d7c40ed5ade816bf499b6d1e14fdf7f17cbc2a0a37b6f0306a94"} May 06 17:12:07.251919 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:07.251900 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-655d88fc6c-6fkl6" May 06 17:12:07.252663 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:07.252645 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d26ad6c1-538b-46c9-beb2-48279a7382cd","Type":"ContainerStarted","Data":"4960c3a6c82f9ad2391f3a3c14e734b82b7b374ef0fcc4bde108b44db45545e4"} May 06 17:12:07.280175 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:07.280137 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-667d5c986-57xjw" podStartSLOduration=1.948025807 podStartE2EDuration="4.280127874s" podCreationTimestamp="2026-05-06 17:12:03 +0000 UTC" firstStartedPulling="2026-05-06 17:12:03.780639353 +0000 UTC m=+97.422036930" lastFinishedPulling="2026-05-06 17:12:06.112741409 +0000 UTC m=+99.754138997" observedRunningTime="2026-05-06 17:12:07.279298447 +0000 UTC m=+100.920696044" watchObservedRunningTime="2026-05-06 17:12:07.280127874 +0000 UTC m=+100.921525482" May 06 17:12:07.350001 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:07.349960 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-655d88fc6c-6fkl6" podStartSLOduration=2.161415573 podStartE2EDuration="4.349947492s" podCreationTimestamp="2026-05-06 17:12:03 +0000 UTC" firstStartedPulling="2026-05-06 17:12:03.924512816 +0000 UTC m=+97.565910392" lastFinishedPulling="2026-05-06 17:12:06.113044726 +0000 UTC m=+99.754442311" observedRunningTime="2026-05-06 17:12:07.348760686 +0000 UTC m=+100.990158278" 
watchObservedRunningTime="2026-05-06 17:12:07.349947492 +0000 UTC m=+100.991345089" May 06 17:12:07.386188 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:07.386140 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.12760729 podStartE2EDuration="8.386124226s" podCreationTimestamp="2026-05-06 17:11:59 +0000 UTC" firstStartedPulling="2026-05-06 17:12:00.854503161 +0000 UTC m=+94.495900737" lastFinishedPulling="2026-05-06 17:12:06.113020083 +0000 UTC m=+99.754417673" observedRunningTime="2026-05-06 17:12:07.384292272 +0000 UTC m=+101.025689881" watchObservedRunningTime="2026-05-06 17:12:07.386124226 +0000 UTC m=+101.027521825" May 06 17:12:07.411296 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:07.409697 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5b669d585c-7xqfk" podStartSLOduration=2.6434568929999998 podStartE2EDuration="4.409679524s" podCreationTimestamp="2026-05-06 17:12:03 +0000 UTC" firstStartedPulling="2026-05-06 17:12:04.381619219 +0000 UTC m=+98.023016799" lastFinishedPulling="2026-05-06 17:12:06.147841839 +0000 UTC m=+99.789239430" observedRunningTime="2026-05-06 17:12:07.406151943 +0000 UTC m=+101.047549553" watchObservedRunningTime="2026-05-06 17:12:07.409679524 +0000 UTC m=+101.051077126" May 06 17:12:08.124108 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.124072 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-k855j" May 06 17:12:08.444399 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.444361 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7f66bf84b5-vgnd7"] May 06 17:12:08.486794 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.486766 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5696456b54-gvcqb"] May 06 17:12:08.505277 
ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.505252 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.517928 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.517896 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5696456b54-gvcqb"] May 06 17:12:08.604904 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.604871 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce610b8b-bbda-4278-8ca1-36ddf712814b-console-oauth-config\") pod \"console-5696456b54-gvcqb\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.605079 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.604932 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-service-ca\") pod \"console-5696456b54-gvcqb\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.605079 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.604988 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-trusted-ca-bundle\") pod \"console-5696456b54-gvcqb\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.605169 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.605097 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce610b8b-bbda-4278-8ca1-36ddf712814b-console-serving-cert\") pod \"console-5696456b54-gvcqb\" 
(UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.605169 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.605140 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sj9g\" (UniqueName: \"kubernetes.io/projected/ce610b8b-bbda-4278-8ca1-36ddf712814b-kube-api-access-4sj9g\") pod \"console-5696456b54-gvcqb\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.605251 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.605217 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-console-config\") pod \"console-5696456b54-gvcqb\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.605298 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.605268 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-oauth-serving-cert\") pod \"console-5696456b54-gvcqb\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.706350 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.706267 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-service-ca\") pod \"console-5696456b54-gvcqb\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.706350 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.706313 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-trusted-ca-bundle\") pod \"console-5696456b54-gvcqb\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.706633 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.706355 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce610b8b-bbda-4278-8ca1-36ddf712814b-console-serving-cert\") pod \"console-5696456b54-gvcqb\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.706633 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.706383 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4sj9g\" (UniqueName: \"kubernetes.io/projected/ce610b8b-bbda-4278-8ca1-36ddf712814b-kube-api-access-4sj9g\") pod \"console-5696456b54-gvcqb\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.706633 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.706603 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-console-config\") pod \"console-5696456b54-gvcqb\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.706787 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.706660 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-oauth-serving-cert\") pod \"console-5696456b54-gvcqb\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.706787 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.706734 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce610b8b-bbda-4278-8ca1-36ddf712814b-console-oauth-config\") pod \"console-5696456b54-gvcqb\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.707146 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.707116 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-service-ca\") pod \"console-5696456b54-gvcqb\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.707331 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.707280 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-trusted-ca-bundle\") pod \"console-5696456b54-gvcqb\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.707331 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.707296 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-console-config\") pod \"console-5696456b54-gvcqb\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.707491 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.707340 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-oauth-serving-cert\") pod \"console-5696456b54-gvcqb\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.709470 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.709447 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce610b8b-bbda-4278-8ca1-36ddf712814b-console-serving-cert\") pod \"console-5696456b54-gvcqb\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.709547 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.709499 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce610b8b-bbda-4278-8ca1-36ddf712814b-console-oauth-config\") pod \"console-5696456b54-gvcqb\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.716234 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.716214 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sj9g\" (UniqueName: \"kubernetes.io/projected/ce610b8b-bbda-4278-8ca1-36ddf712814b-kube-api-access-4sj9g\") pod \"console-5696456b54-gvcqb\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.818100 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.818070 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:08.980114 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:08.980091 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5696456b54-gvcqb"] May 06 17:12:08.981705 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:12:08.981675 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce610b8b_bbda_4278_8ca1_36ddf712814b.slice/crio-da9f3b625594125809e6bed31a94d104ead00715c31b2e2538a42e875a0256f0 WatchSource:0}: Error finding container da9f3b625594125809e6bed31a94d104ead00715c31b2e2538a42e875a0256f0: Status 404 returned error can't find the container with id da9f3b625594125809e6bed31a94d104ead00715c31b2e2538a42e875a0256f0 May 06 17:12:09.262798 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:09.262719 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5696456b54-gvcqb" event={"ID":"ce610b8b-bbda-4278-8ca1-36ddf712814b","Type":"ContainerStarted","Data":"83091eb1f04569a5c4856ecb06c473510ce9f9208fdffaa4f5decd9d40303138"} May 06 17:12:09.262798 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:09.262765 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5696456b54-gvcqb" event={"ID":"ce610b8b-bbda-4278-8ca1-36ddf712814b","Type":"ContainerStarted","Data":"da9f3b625594125809e6bed31a94d104ead00715c31b2e2538a42e875a0256f0"} May 06 17:12:09.287927 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:09.287866 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5696456b54-gvcqb" podStartSLOduration=1.287846907 podStartE2EDuration="1.287846907s" podCreationTimestamp="2026-05-06 17:12:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-06 17:12:09.287270381 +0000 UTC 
m=+102.928667980" watchObservedRunningTime="2026-05-06 17:12:09.287846907 +0000 UTC m=+102.929244506" May 06 17:12:11.270708 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:11.270673 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"623dd84d-9fb6-4d35-bd9d-8202b9017be2","Type":"ContainerStarted","Data":"0b740a1def700cb470c9c247e6e8f69ecf06852c5b5d4f2f32b44ffe22cefe16"} May 06 17:12:11.270708 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:11.270708 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"623dd84d-9fb6-4d35-bd9d-8202b9017be2","Type":"ContainerStarted","Data":"675319877d873a05f7f53fa9287dc29b15ff8f4a38aebfd3dbb8e7363aa75365"} May 06 17:12:13.280232 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:13.280191 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"623dd84d-9fb6-4d35-bd9d-8202b9017be2","Type":"ContainerStarted","Data":"396457f47d7d8ee43cdd10c1d489f288438b8307d9d73ee2485043b64c267408"} May 06 17:12:13.280232 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:13.280236 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"623dd84d-9fb6-4d35-bd9d-8202b9017be2","Type":"ContainerStarted","Data":"47b0e7a72034b890408b42ec739472b1c256d1ad7f7b34f6ae885a5f4156a36a"} May 06 17:12:13.280663 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:13.280250 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"623dd84d-9fb6-4d35-bd9d-8202b9017be2","Type":"ContainerStarted","Data":"200f40cf66db4ba4f62e7dfba0559350f81dca67cb7b497f69bc44552ba73808"} May 06 17:12:13.280663 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:13.280262 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"623dd84d-9fb6-4d35-bd9d-8202b9017be2","Type":"ContainerStarted","Data":"d1926519281a13b9c5d19be5c96c261cd13db8fe30a0516943d15305e8600d18"} May 06 17:12:14.010847 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:14.010806 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7f66bf84b5-vgnd7" May 06 17:12:15.407923 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:15.407869 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" May 06 17:12:18.818800 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:18.818758 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:18.818800 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:18.818806 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:18.823180 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:18.823159 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:18.844706 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:18.844659 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=8.532900538 podStartE2EDuration="13.84464516s" podCreationTimestamp="2026-05-06 17:12:05 +0000 UTC" firstStartedPulling="2026-05-06 17:12:07.250738841 +0000 UTC m=+100.892136423" lastFinishedPulling="2026-05-06 17:12:12.562483456 +0000 UTC m=+106.203881045" observedRunningTime="2026-05-06 17:12:13.332187265 +0000 UTC m=+106.973584863" watchObservedRunningTime="2026-05-06 17:12:18.84464516 +0000 UTC m=+112.486042767" May 06 17:12:19.301622 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:19.301579 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/console-5696456b54-gvcqb" May 06 17:12:19.357204 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:19.357173 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7677d75f5b-t2pqp"] May 06 17:12:23.528345 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:23.528315 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:23.528345 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:23.528352 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-667d5c986-57xjw" May 06 17:12:26.223014 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:26.222946 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-f89d694d9-lhbr9" podUID="dcad3307-adba-45cf-a174-5561ce168948" containerName="console" containerID="cri-o://54aa260b3fef6862c5d0f5e88794cab69ee12f14c7c3e9d76bcff443d8229b5e" gracePeriod=15 May 06 17:12:26.464315 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:26.464293 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f89d694d9-lhbr9_dcad3307-adba-45cf-a174-5561ce168948/console/0.log" May 06 17:12:26.464411 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:26.464351 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f89d694d9-lhbr9" May 06 17:12:26.560225 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:26.560139 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dcad3307-adba-45cf-a174-5561ce168948-console-config\") pod \"dcad3307-adba-45cf-a174-5561ce168948\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " May 06 17:12:26.560225 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:26.560184 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcad3307-adba-45cf-a174-5561ce168948-service-ca\") pod \"dcad3307-adba-45cf-a174-5561ce168948\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " May 06 17:12:26.560225 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:26.560214 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jbf5\" (UniqueName: \"kubernetes.io/projected/dcad3307-adba-45cf-a174-5561ce168948-kube-api-access-4jbf5\") pod \"dcad3307-adba-45cf-a174-5561ce168948\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " May 06 17:12:26.560495 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:26.560238 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcad3307-adba-45cf-a174-5561ce168948-console-serving-cert\") pod \"dcad3307-adba-45cf-a174-5561ce168948\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " May 06 17:12:26.560495 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:26.560282 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dcad3307-adba-45cf-a174-5561ce168948-console-oauth-config\") pod \"dcad3307-adba-45cf-a174-5561ce168948\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " May 06 17:12:26.560495 
ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:26.560306 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dcad3307-adba-45cf-a174-5561ce168948-oauth-serving-cert\") pod \"dcad3307-adba-45cf-a174-5561ce168948\" (UID: \"dcad3307-adba-45cf-a174-5561ce168948\") " May 06 17:12:26.560692 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:26.560551 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcad3307-adba-45cf-a174-5561ce168948-console-config" (OuterVolumeSpecName: "console-config") pod "dcad3307-adba-45cf-a174-5561ce168948" (UID: "dcad3307-adba-45cf-a174-5561ce168948"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 06 17:12:26.560692 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:26.560670 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcad3307-adba-45cf-a174-5561ce168948-service-ca" (OuterVolumeSpecName: "service-ca") pod "dcad3307-adba-45cf-a174-5561ce168948" (UID: "dcad3307-adba-45cf-a174-5561ce168948"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 06 17:12:26.560802 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:26.560771 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcad3307-adba-45cf-a174-5561ce168948-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "dcad3307-adba-45cf-a174-5561ce168948" (UID: "dcad3307-adba-45cf-a174-5561ce168948"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 06 17:12:26.562598 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:26.562558 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcad3307-adba-45cf-a174-5561ce168948-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "dcad3307-adba-45cf-a174-5561ce168948" (UID: "dcad3307-adba-45cf-a174-5561ce168948"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 06 17:12:26.562666 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:26.562622 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcad3307-adba-45cf-a174-5561ce168948-kube-api-access-4jbf5" (OuterVolumeSpecName: "kube-api-access-4jbf5") pod "dcad3307-adba-45cf-a174-5561ce168948" (UID: "dcad3307-adba-45cf-a174-5561ce168948"). InnerVolumeSpecName "kube-api-access-4jbf5". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 06 17:12:26.562666 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:26.562647 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcad3307-adba-45cf-a174-5561ce168948-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "dcad3307-adba-45cf-a174-5561ce168948" (UID: "dcad3307-adba-45cf-a174-5561ce168948"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
May 06 17:12:26.661851 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:26.661811 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dcad3307-adba-45cf-a174-5561ce168948-console-config\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:12:26.661851 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:26.661843 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcad3307-adba-45cf-a174-5561ce168948-service-ca\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:12:26.661851 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:26.661856 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4jbf5\" (UniqueName: \"kubernetes.io/projected/dcad3307-adba-45cf-a174-5561ce168948-kube-api-access-4jbf5\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:12:26.662083 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:26.661869 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcad3307-adba-45cf-a174-5561ce168948-console-serving-cert\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:12:26.662083 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:26.661881 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dcad3307-adba-45cf-a174-5561ce168948-console-oauth-config\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:12:26.662083 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:26.661893 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dcad3307-adba-45cf-a174-5561ce168948-oauth-serving-cert\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:12:27.327349 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:27.327311 2578 generic.go:358] "Generic (PLEG): container finished" podID="dcad3307-adba-45cf-a174-5561ce168948" containerID="54aa260b3fef6862c5d0f5e88794cab69ee12f14c7c3e9d76bcff443d8229b5e" exitCode=2
May 06 17:12:27.327862 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:27.327367 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f89d694d9-lhbr9" event={"ID":"dcad3307-adba-45cf-a174-5561ce168948","Type":"ContainerDied","Data":"54aa260b3fef6862c5d0f5e88794cab69ee12f14c7c3e9d76bcff443d8229b5e"}
May 06 17:12:27.327862 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:27.327393 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f89d694d9-lhbr9" event={"ID":"dcad3307-adba-45cf-a174-5561ce168948","Type":"ContainerDied","Data":"2ba302c3ae88064afdca223ec5a8f5da59aa920c7b4b30955374f9cec169ef5e"}
May 06 17:12:27.327862 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:27.327407 2578 scope.go:117] "RemoveContainer" containerID="54aa260b3fef6862c5d0f5e88794cab69ee12f14c7c3e9d76bcff443d8229b5e"
May 06 17:12:27.327862 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:27.327403 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f89d694d9-lhbr9"
May 06 17:12:27.335021 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:27.335003 2578 scope.go:117] "RemoveContainer" containerID="54aa260b3fef6862c5d0f5e88794cab69ee12f14c7c3e9d76bcff443d8229b5e"
May 06 17:12:27.335273 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:12:27.335252 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54aa260b3fef6862c5d0f5e88794cab69ee12f14c7c3e9d76bcff443d8229b5e\": container with ID starting with 54aa260b3fef6862c5d0f5e88794cab69ee12f14c7c3e9d76bcff443d8229b5e not found: ID does not exist" containerID="54aa260b3fef6862c5d0f5e88794cab69ee12f14c7c3e9d76bcff443d8229b5e"
May 06 17:12:27.335319 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:27.335281 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54aa260b3fef6862c5d0f5e88794cab69ee12f14c7c3e9d76bcff443d8229b5e"} err="failed to get container status \"54aa260b3fef6862c5d0f5e88794cab69ee12f14c7c3e9d76bcff443d8229b5e\": rpc error: code = NotFound desc = could not find container \"54aa260b3fef6862c5d0f5e88794cab69ee12f14c7c3e9d76bcff443d8229b5e\": container with ID starting with 54aa260b3fef6862c5d0f5e88794cab69ee12f14c7c3e9d76bcff443d8229b5e not found: ID does not exist"
May 06 17:12:27.359463 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:27.359395 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f89d694d9-lhbr9"]
May 06 17:12:27.361498 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:27.361476 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f89d694d9-lhbr9"]
May 06 17:12:28.885226 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:28.885192 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcad3307-adba-45cf-a174-5561ce168948" path="/var/lib/kubelet/pods/dcad3307-adba-45cf-a174-5561ce168948/volumes"
May 06 17:12:33.465926 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.465884 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7f66bf84b5-vgnd7" podUID="dfdecf74-1eda-4e09-95a9-48f3668705e9" containerName="console" containerID="cri-o://3da8fa7e38b8f1085b900a1fd47590d5d284eb91626267867a047bb1f296f3ac" gracePeriod=15
May 06 17:12:33.703077 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.703055 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f66bf84b5-vgnd7_dfdecf74-1eda-4e09-95a9-48f3668705e9/console/0.log"
May 06 17:12:33.703177 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.703114 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f66bf84b5-vgnd7"
May 06 17:12:33.826643 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.826523 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl5vb\" (UniqueName: \"kubernetes.io/projected/dfdecf74-1eda-4e09-95a9-48f3668705e9-kube-api-access-nl5vb\") pod \"dfdecf74-1eda-4e09-95a9-48f3668705e9\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") "
May 06 17:12:33.826643 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.826606 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-console-config\") pod \"dfdecf74-1eda-4e09-95a9-48f3668705e9\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") "
May 06 17:12:33.826842 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.826739 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dfdecf74-1eda-4e09-95a9-48f3668705e9-console-oauth-config\") pod \"dfdecf74-1eda-4e09-95a9-48f3668705e9\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") "
May 06 17:12:33.826842 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.826790 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-service-ca\") pod \"dfdecf74-1eda-4e09-95a9-48f3668705e9\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") "
May 06 17:12:33.826842 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.826826 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dfdecf74-1eda-4e09-95a9-48f3668705e9-console-serving-cert\") pod \"dfdecf74-1eda-4e09-95a9-48f3668705e9\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") "
May 06 17:12:33.826977 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.826863 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-oauth-serving-cert\") pod \"dfdecf74-1eda-4e09-95a9-48f3668705e9\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") "
May 06 17:12:33.826977 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.826941 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-trusted-ca-bundle\") pod \"dfdecf74-1eda-4e09-95a9-48f3668705e9\" (UID: \"dfdecf74-1eda-4e09-95a9-48f3668705e9\") "
May 06 17:12:33.827124 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.827024 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-console-config" (OuterVolumeSpecName: "console-config") pod "dfdecf74-1eda-4e09-95a9-48f3668705e9" (UID: "dfdecf74-1eda-4e09-95a9-48f3668705e9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
May 06 17:12:33.827224 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.827189 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-service-ca" (OuterVolumeSpecName: "service-ca") pod "dfdecf74-1eda-4e09-95a9-48f3668705e9" (UID: "dfdecf74-1eda-4e09-95a9-48f3668705e9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
May 06 17:12:33.827445 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.827246 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-console-config\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:12:33.827445 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.827309 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "dfdecf74-1eda-4e09-95a9-48f3668705e9" (UID: "dfdecf74-1eda-4e09-95a9-48f3668705e9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
May 06 17:12:33.827445 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.827432 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "dfdecf74-1eda-4e09-95a9-48f3668705e9" (UID: "dfdecf74-1eda-4e09-95a9-48f3668705e9"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
May 06 17:12:33.828946 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.828924 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfdecf74-1eda-4e09-95a9-48f3668705e9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "dfdecf74-1eda-4e09-95a9-48f3668705e9" (UID: "dfdecf74-1eda-4e09-95a9-48f3668705e9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
May 06 17:12:33.829026 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.828945 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfdecf74-1eda-4e09-95a9-48f3668705e9-kube-api-access-nl5vb" (OuterVolumeSpecName: "kube-api-access-nl5vb") pod "dfdecf74-1eda-4e09-95a9-48f3668705e9" (UID: "dfdecf74-1eda-4e09-95a9-48f3668705e9"). InnerVolumeSpecName "kube-api-access-nl5vb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
May 06 17:12:33.829026 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.828982 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfdecf74-1eda-4e09-95a9-48f3668705e9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "dfdecf74-1eda-4e09-95a9-48f3668705e9" (UID: "dfdecf74-1eda-4e09-95a9-48f3668705e9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
May 06 17:12:33.928194 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.928158 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-trusted-ca-bundle\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:12:33.928194 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.928187 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nl5vb\" (UniqueName: \"kubernetes.io/projected/dfdecf74-1eda-4e09-95a9-48f3668705e9-kube-api-access-nl5vb\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:12:33.928194 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.928202 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dfdecf74-1eda-4e09-95a9-48f3668705e9-console-oauth-config\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:12:33.928421 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.928214 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-service-ca\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:12:33.928421 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.928228 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dfdecf74-1eda-4e09-95a9-48f3668705e9-console-serving-cert\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:12:33.928421 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:33.928241 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dfdecf74-1eda-4e09-95a9-48f3668705e9-oauth-serving-cert\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:12:34.349031 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:34.349004 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f66bf84b5-vgnd7_dfdecf74-1eda-4e09-95a9-48f3668705e9/console/0.log"
May 06 17:12:34.349238 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:34.349043 2578 generic.go:358] "Generic (PLEG): container finished" podID="dfdecf74-1eda-4e09-95a9-48f3668705e9" containerID="3da8fa7e38b8f1085b900a1fd47590d5d284eb91626267867a047bb1f296f3ac" exitCode=2
May 06 17:12:34.349238 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:34.349072 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f66bf84b5-vgnd7" event={"ID":"dfdecf74-1eda-4e09-95a9-48f3668705e9","Type":"ContainerDied","Data":"3da8fa7e38b8f1085b900a1fd47590d5d284eb91626267867a047bb1f296f3ac"}
May 06 17:12:34.349238 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:34.349116 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f66bf84b5-vgnd7" event={"ID":"dfdecf74-1eda-4e09-95a9-48f3668705e9","Type":"ContainerDied","Data":"5320ebda39cad45786497cf6d43404d65769e298939e158318031ca023d3a81e"}
May 06 17:12:34.349238 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:34.349120 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f66bf84b5-vgnd7"
May 06 17:12:34.349238 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:34.349135 2578 scope.go:117] "RemoveContainer" containerID="3da8fa7e38b8f1085b900a1fd47590d5d284eb91626267867a047bb1f296f3ac"
May 06 17:12:34.359257 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:34.359226 2578 scope.go:117] "RemoveContainer" containerID="3da8fa7e38b8f1085b900a1fd47590d5d284eb91626267867a047bb1f296f3ac"
May 06 17:12:34.359640 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:12:34.359622 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3da8fa7e38b8f1085b900a1fd47590d5d284eb91626267867a047bb1f296f3ac\": container with ID starting with 3da8fa7e38b8f1085b900a1fd47590d5d284eb91626267867a047bb1f296f3ac not found: ID does not exist" containerID="3da8fa7e38b8f1085b900a1fd47590d5d284eb91626267867a047bb1f296f3ac"
May 06 17:12:34.359714 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:34.359647 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3da8fa7e38b8f1085b900a1fd47590d5d284eb91626267867a047bb1f296f3ac"} err="failed to get container status \"3da8fa7e38b8f1085b900a1fd47590d5d284eb91626267867a047bb1f296f3ac\": rpc error: code = NotFound desc = could not find container \"3da8fa7e38b8f1085b900a1fd47590d5d284eb91626267867a047bb1f296f3ac\": container with ID starting with 3da8fa7e38b8f1085b900a1fd47590d5d284eb91626267867a047bb1f296f3ac not found: ID does not exist"
May 06 17:12:34.372900 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:34.372878 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7f66bf84b5-vgnd7"]
May 06 17:12:34.377022 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:34.377004 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7f66bf84b5-vgnd7"]
May 06 17:12:34.884503 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:34.884471 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfdecf74-1eda-4e09-95a9-48f3668705e9" path="/var/lib/kubelet/pods/dfdecf74-1eda-4e09-95a9-48f3668705e9/volumes"
May 06 17:12:43.533603 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:43.533558 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-667d5c986-57xjw"
May 06 17:12:43.537388 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:43.537364 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-667d5c986-57xjw"
May 06 17:12:44.376729 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.376693 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7677d75f5b-t2pqp" podUID="2f494aed-09f3-4cc4-9eca-7c79c5fd611b" containerName="console" containerID="cri-o://c34f7e52f10f566a7a8a2891216d076345ce7ea405f21c8f6f2efa76eb76355d" gracePeriod=15
May 06 17:12:44.615515 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.615493 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7677d75f5b-t2pqp_2f494aed-09f3-4cc4-9eca-7c79c5fd611b/console/0.log"
May 06 17:12:44.615844 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.615554 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:12:44.729515 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.729468 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-oauth-serving-cert\") pod \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") "
May 06 17:12:44.729719 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.729562 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z4jv\" (UniqueName: \"kubernetes.io/projected/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-kube-api-access-8z4jv\") pod \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") "
May 06 17:12:44.729719 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.729610 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-console-config\") pod \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") "
May 06 17:12:44.729719 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.729632 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-console-serving-cert\") pod \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") "
May 06 17:12:44.729719 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.729648 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-trusted-ca-bundle\") pod \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") "
May 06 17:12:44.729719 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.729699 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-service-ca\") pod \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") "
May 06 17:12:44.729719 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.729718 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-console-oauth-config\") pod \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\" (UID: \"2f494aed-09f3-4cc4-9eca-7c79c5fd611b\") "
May 06 17:12:44.730023 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.729975 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2f494aed-09f3-4cc4-9eca-7c79c5fd611b" (UID: "2f494aed-09f3-4cc4-9eca-7c79c5fd611b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
May 06 17:12:44.730124 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.730097 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-console-config" (OuterVolumeSpecName: "console-config") pod "2f494aed-09f3-4cc4-9eca-7c79c5fd611b" (UID: "2f494aed-09f3-4cc4-9eca-7c79c5fd611b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
May 06 17:12:44.730124 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.730113 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-service-ca" (OuterVolumeSpecName: "service-ca") pod "2f494aed-09f3-4cc4-9eca-7c79c5fd611b" (UID: "2f494aed-09f3-4cc4-9eca-7c79c5fd611b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
May 06 17:12:44.730225 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.730159 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2f494aed-09f3-4cc4-9eca-7c79c5fd611b" (UID: "2f494aed-09f3-4cc4-9eca-7c79c5fd611b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
May 06 17:12:44.731798 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.731772 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2f494aed-09f3-4cc4-9eca-7c79c5fd611b" (UID: "2f494aed-09f3-4cc4-9eca-7c79c5fd611b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
May 06 17:12:44.731898 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.731834 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-kube-api-access-8z4jv" (OuterVolumeSpecName: "kube-api-access-8z4jv") pod "2f494aed-09f3-4cc4-9eca-7c79c5fd611b" (UID: "2f494aed-09f3-4cc4-9eca-7c79c5fd611b"). InnerVolumeSpecName "kube-api-access-8z4jv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
May 06 17:12:44.731898 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.731853 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2f494aed-09f3-4cc4-9eca-7c79c5fd611b" (UID: "2f494aed-09f3-4cc4-9eca-7c79c5fd611b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
May 06 17:12:44.830713 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.830683 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8z4jv\" (UniqueName: \"kubernetes.io/projected/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-kube-api-access-8z4jv\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:12:44.830713 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.830708 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-console-config\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:12:44.830713 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.830717 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-console-serving-cert\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:12:44.830923 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.830727 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-trusted-ca-bundle\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:12:44.830923 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.830736 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-service-ca\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:12:44.830923 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.830745 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-console-oauth-config\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:12:44.830923 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:44.830754 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f494aed-09f3-4cc4-9eca-7c79c5fd611b-oauth-serving-cert\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:12:45.382801 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:45.382771 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7677d75f5b-t2pqp_2f494aed-09f3-4cc4-9eca-7c79c5fd611b/console/0.log"
May 06 17:12:45.382963 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:45.382810 2578 generic.go:358] "Generic (PLEG): container finished" podID="2f494aed-09f3-4cc4-9eca-7c79c5fd611b" containerID="c34f7e52f10f566a7a8a2891216d076345ce7ea405f21c8f6f2efa76eb76355d" exitCode=2
May 06 17:12:45.382963 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:45.382897 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7677d75f5b-t2pqp" event={"ID":"2f494aed-09f3-4cc4-9eca-7c79c5fd611b","Type":"ContainerDied","Data":"c34f7e52f10f566a7a8a2891216d076345ce7ea405f21c8f6f2efa76eb76355d"}
May 06 17:12:45.382963 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:45.382939 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7677d75f5b-t2pqp" event={"ID":"2f494aed-09f3-4cc4-9eca-7c79c5fd611b","Type":"ContainerDied","Data":"722c05b218f96c49d4eb7e8ebfeacd2f3f1b34d205421442a5129b8c23dc6e6d"}
May 06 17:12:45.382963 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:45.382953 2578 scope.go:117] "RemoveContainer" containerID="c34f7e52f10f566a7a8a2891216d076345ce7ea405f21c8f6f2efa76eb76355d"
May 06 17:12:45.383138 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:45.382905 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7677d75f5b-t2pqp"
May 06 17:12:45.390626 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:45.390609 2578 scope.go:117] "RemoveContainer" containerID="c34f7e52f10f566a7a8a2891216d076345ce7ea405f21c8f6f2efa76eb76355d"
May 06 17:12:45.390863 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:12:45.390845 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c34f7e52f10f566a7a8a2891216d076345ce7ea405f21c8f6f2efa76eb76355d\": container with ID starting with c34f7e52f10f566a7a8a2891216d076345ce7ea405f21c8f6f2efa76eb76355d not found: ID does not exist" containerID="c34f7e52f10f566a7a8a2891216d076345ce7ea405f21c8f6f2efa76eb76355d"
May 06 17:12:45.390925 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:45.390870 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c34f7e52f10f566a7a8a2891216d076345ce7ea405f21c8f6f2efa76eb76355d"} err="failed to get container status \"c34f7e52f10f566a7a8a2891216d076345ce7ea405f21c8f6f2efa76eb76355d\": rpc error: code = NotFound desc = could not find container \"c34f7e52f10f566a7a8a2891216d076345ce7ea405f21c8f6f2efa76eb76355d\": container with ID starting with c34f7e52f10f566a7a8a2891216d076345ce7ea405f21c8f6f2efa76eb76355d not found: ID does not exist"
May 06 17:12:45.407381 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:45.407354 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7677d75f5b-t2pqp"]
May 06 17:12:45.413299 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:45.413280 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7677d75f5b-t2pqp"]
May 06 17:12:46.884364 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:12:46.884333 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f494aed-09f3-4cc4-9eca-7c79c5fd611b" path="/var/lib/kubelet/pods/2f494aed-09f3-4cc4-9eca-7c79c5fd611b/volumes"
May 06 17:13:05.407971 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:05.407939 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:13:05.426613 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:05.426576 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:13:05.453622 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:05.453579 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
May 06 17:13:21.251886 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.251809 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d47b68768-6v9ml"]
May 06 17:13:21.252302 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.252153 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfdecf74-1eda-4e09-95a9-48f3668705e9" containerName="console"
May 06 17:13:21.252302 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.252167 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfdecf74-1eda-4e09-95a9-48f3668705e9" containerName="console"
May 06 17:13:21.252302 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.252181 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f494aed-09f3-4cc4-9eca-7c79c5fd611b" containerName="console"
May 06 17:13:21.252302 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.252189 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f494aed-09f3-4cc4-9eca-7c79c5fd611b" containerName="console"
May 06 17:13:21.252302 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.252206 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dcad3307-adba-45cf-a174-5561ce168948" containerName="console"
May 06 17:13:21.252302 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.252213 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcad3307-adba-45cf-a174-5561ce168948" containerName="console"
May 06 17:13:21.252302 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.252264 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f494aed-09f3-4cc4-9eca-7c79c5fd611b" containerName="console"
May 06 17:13:21.252302 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.252276 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="dcad3307-adba-45cf-a174-5561ce168948" containerName="console"
May 06 17:13:21.252302 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.252285 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="dfdecf74-1eda-4e09-95a9-48f3668705e9" containerName="console"
May 06 17:13:21.254036 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.254020 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d47b68768-6v9ml"
May 06 17:13:21.269871 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.269844 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d47b68768-6v9ml"]
May 06 17:13:21.334382 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.334359 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wsbt\" (UniqueName: \"kubernetes.io/projected/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-kube-api-access-4wsbt\") pod \"console-5d47b68768-6v9ml\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") " pod="openshift-console/console-5d47b68768-6v9ml"
May 06 17:13:21.334524 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.334401 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-console-oauth-config\") pod \"console-5d47b68768-6v9ml\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") " pod="openshift-console/console-5d47b68768-6v9ml"
May 06 17:13:21.334524 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.334426 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-service-ca\") pod \"console-5d47b68768-6v9ml\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") " pod="openshift-console/console-5d47b68768-6v9ml"
May 06 17:13:21.334524 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.334449 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-trusted-ca-bundle\") pod \"console-5d47b68768-6v9ml\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") " pod="openshift-console/console-5d47b68768-6v9ml"
May 06 17:13:21.334524 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.334471 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-console-serving-cert\") pod \"console-5d47b68768-6v9ml\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") " pod="openshift-console/console-5d47b68768-6v9ml"
May 06 17:13:21.334524 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.334497 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-oauth-serving-cert\") pod \"console-5d47b68768-6v9ml\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") " pod="openshift-console/console-5d47b68768-6v9ml"
May 06 17:13:21.334707 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.334522 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-console-config\") pod \"console-5d47b68768-6v9ml\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") " pod="openshift-console/console-5d47b68768-6v9ml"
May 06 17:13:21.435860 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.435822 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wsbt\" (UniqueName: \"kubernetes.io/projected/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-kube-api-access-4wsbt\") pod \"console-5d47b68768-6v9ml\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") " pod="openshift-console/console-5d47b68768-6v9ml"
May 06 17:13:21.436006 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.435885 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-console-oauth-config\") pod \"console-5d47b68768-6v9ml\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") " pod="openshift-console/console-5d47b68768-6v9ml"
May 06 17:13:21.436074 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.436054 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-service-ca\") pod \"console-5d47b68768-6v9ml\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") " pod="openshift-console/console-5d47b68768-6v9ml"
May 06 17:13:21.436110 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.436087 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-trusted-ca-bundle\") pod \"console-5d47b68768-6v9ml\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") " pod="openshift-console/console-5d47b68768-6v9ml"
May 06 17:13:21.436154 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.436115 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-console-serving-cert\") pod \"console-5d47b68768-6v9ml\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") " pod="openshift-console/console-5d47b68768-6v9ml"
May 06 17:13:21.436200 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.436162 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-oauth-serving-cert\") pod \"console-5d47b68768-6v9ml\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") " pod="openshift-console/console-5d47b68768-6v9ml"
May 06 17:13:21.436200 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.436188 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName:
\"kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-console-config\") pod \"console-5d47b68768-6v9ml\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") " pod="openshift-console/console-5d47b68768-6v9ml" May 06 17:13:21.436708 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.436680 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-service-ca\") pod \"console-5d47b68768-6v9ml\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") " pod="openshift-console/console-5d47b68768-6v9ml" May 06 17:13:21.436976 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.436950 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-oauth-serving-cert\") pod \"console-5d47b68768-6v9ml\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") " pod="openshift-console/console-5d47b68768-6v9ml" May 06 17:13:21.437025 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.436969 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-console-config\") pod \"console-5d47b68768-6v9ml\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") " pod="openshift-console/console-5d47b68768-6v9ml" May 06 17:13:21.437066 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.437032 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-trusted-ca-bundle\") pod \"console-5d47b68768-6v9ml\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") " pod="openshift-console/console-5d47b68768-6v9ml" May 06 17:13:21.438243 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.438225 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-console-oauth-config\") pod \"console-5d47b68768-6v9ml\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") " pod="openshift-console/console-5d47b68768-6v9ml" May 06 17:13:21.438631 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.438613 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-console-serving-cert\") pod \"console-5d47b68768-6v9ml\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") " pod="openshift-console/console-5d47b68768-6v9ml" May 06 17:13:21.445277 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.445242 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wsbt\" (UniqueName: \"kubernetes.io/projected/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-kube-api-access-4wsbt\") pod \"console-5d47b68768-6v9ml\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") " pod="openshift-console/console-5d47b68768-6v9ml" May 06 17:13:21.562436 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.562359 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d47b68768-6v9ml" May 06 17:13:21.687079 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:21.687057 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d47b68768-6v9ml"] May 06 17:13:21.689744 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:13:21.689714 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode84f004d_21a8_4850_8cbf_e27ddaa4ec19.slice/crio-14d6b0afa7f6152c34ea055130e2817fbff2df8064e0b15d0d7ebc9f66259f90 WatchSource:0}: Error finding container 14d6b0afa7f6152c34ea055130e2817fbff2df8064e0b15d0d7ebc9f66259f90: Status 404 returned error can't find the container with id 14d6b0afa7f6152c34ea055130e2817fbff2df8064e0b15d0d7ebc9f66259f90 May 06 17:13:22.489184 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:22.489142 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d47b68768-6v9ml" event={"ID":"e84f004d-21a8-4850-8cbf-e27ddaa4ec19","Type":"ContainerStarted","Data":"2c9ae22564c2adb3afb7b279cb2cf05c6fb5d8ddf082301b9cad39e5a2607eb7"} May 06 17:13:22.489184 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:22.489184 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d47b68768-6v9ml" event={"ID":"e84f004d-21a8-4850-8cbf-e27ddaa4ec19","Type":"ContainerStarted","Data":"14d6b0afa7f6152c34ea055130e2817fbff2df8064e0b15d0d7ebc9f66259f90"} May 06 17:13:22.513082 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:22.513034 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d47b68768-6v9ml" podStartSLOduration=1.513020197 podStartE2EDuration="1.513020197s" podCreationTimestamp="2026-05-06 17:13:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-06 17:13:22.511896058 +0000 UTC 
m=+176.153293668" watchObservedRunningTime="2026-05-06 17:13:22.513020197 +0000 UTC m=+176.154417795" May 06 17:13:31.563400 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:31.563367 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5d47b68768-6v9ml" May 06 17:13:31.563843 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:31.563440 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d47b68768-6v9ml" May 06 17:13:31.568035 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:31.568013 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d47b68768-6v9ml" May 06 17:13:32.522112 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:32.522086 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d47b68768-6v9ml" May 06 17:13:32.574705 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:32.574671 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5696456b54-gvcqb"] May 06 17:13:57.593555 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:57.593498 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5696456b54-gvcqb" podUID="ce610b8b-bbda-4278-8ca1-36ddf712814b" containerName="console" containerID="cri-o://83091eb1f04569a5c4856ecb06c473510ce9f9208fdffaa4f5decd9d40303138" gracePeriod=15 May 06 17:13:57.828705 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:57.828678 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5696456b54-gvcqb_ce610b8b-bbda-4278-8ca1-36ddf712814b/console/0.log" May 06 17:13:57.828825 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:57.828748 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5696456b54-gvcqb" May 06 17:13:57.926675 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:57.926651 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-oauth-serving-cert\") pod \"ce610b8b-bbda-4278-8ca1-36ddf712814b\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " May 06 17:13:57.926888 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:57.926718 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce610b8b-bbda-4278-8ca1-36ddf712814b-console-serving-cert\") pod \"ce610b8b-bbda-4278-8ca1-36ddf712814b\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " May 06 17:13:57.926928 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:57.926889 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-trusted-ca-bundle\") pod \"ce610b8b-bbda-4278-8ca1-36ddf712814b\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " May 06 17:13:57.926962 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:57.926940 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce610b8b-bbda-4278-8ca1-36ddf712814b-console-oauth-config\") pod \"ce610b8b-bbda-4278-8ca1-36ddf712814b\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " May 06 17:13:57.926999 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:57.926962 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sj9g\" (UniqueName: \"kubernetes.io/projected/ce610b8b-bbda-4278-8ca1-36ddf712814b-kube-api-access-4sj9g\") pod \"ce610b8b-bbda-4278-8ca1-36ddf712814b\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " May 06 
17:13:57.926999 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:57.926992 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-console-config\") pod \"ce610b8b-bbda-4278-8ca1-36ddf712814b\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " May 06 17:13:57.927098 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:57.927020 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-service-ca\") pod \"ce610b8b-bbda-4278-8ca1-36ddf712814b\" (UID: \"ce610b8b-bbda-4278-8ca1-36ddf712814b\") " May 06 17:13:57.927144 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:57.927115 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ce610b8b-bbda-4278-8ca1-36ddf712814b" (UID: "ce610b8b-bbda-4278-8ca1-36ddf712814b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 06 17:13:57.927324 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:57.927306 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ce610b8b-bbda-4278-8ca1-36ddf712814b" (UID: "ce610b8b-bbda-4278-8ca1-36ddf712814b"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 06 17:13:57.927397 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:57.927315 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-oauth-serving-cert\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\"" May 06 17:13:57.927397 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:57.927355 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-console-config" (OuterVolumeSpecName: "console-config") pod "ce610b8b-bbda-4278-8ca1-36ddf712814b" (UID: "ce610b8b-bbda-4278-8ca1-36ddf712814b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 06 17:13:57.927501 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:57.927421 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-service-ca" (OuterVolumeSpecName: "service-ca") pod "ce610b8b-bbda-4278-8ca1-36ddf712814b" (UID: "ce610b8b-bbda-4278-8ca1-36ddf712814b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 06 17:13:57.928901 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:57.928879 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce610b8b-bbda-4278-8ca1-36ddf712814b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ce610b8b-bbda-4278-8ca1-36ddf712814b" (UID: "ce610b8b-bbda-4278-8ca1-36ddf712814b"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" May 06 17:13:57.929023 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:57.928996 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce610b8b-bbda-4278-8ca1-36ddf712814b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ce610b8b-bbda-4278-8ca1-36ddf712814b" (UID: "ce610b8b-bbda-4278-8ca1-36ddf712814b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 06 17:13:57.929070 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:57.929013 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce610b8b-bbda-4278-8ca1-36ddf712814b-kube-api-access-4sj9g" (OuterVolumeSpecName: "kube-api-access-4sj9g") pod "ce610b8b-bbda-4278-8ca1-36ddf712814b" (UID: "ce610b8b-bbda-4278-8ca1-36ddf712814b"). InnerVolumeSpecName "kube-api-access-4sj9g". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 06 17:13:58.027900 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:58.027843 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce610b8b-bbda-4278-8ca1-36ddf712814b-console-serving-cert\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\"" May 06 17:13:58.027900 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:58.027890 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-trusted-ca-bundle\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\"" May 06 17:13:58.027900 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:58.027904 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce610b8b-bbda-4278-8ca1-36ddf712814b-console-oauth-config\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\"" May 06 17:13:58.027900 
ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:58.027917 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4sj9g\" (UniqueName: \"kubernetes.io/projected/ce610b8b-bbda-4278-8ca1-36ddf712814b-kube-api-access-4sj9g\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\"" May 06 17:13:58.028174 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:58.027931 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-console-config\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\"" May 06 17:13:58.028174 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:58.027943 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce610b8b-bbda-4278-8ca1-36ddf712814b-service-ca\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\"" May 06 17:13:58.595466 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:58.595437 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5696456b54-gvcqb_ce610b8b-bbda-4278-8ca1-36ddf712814b/console/0.log" May 06 17:13:58.595941 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:58.595480 2578 generic.go:358] "Generic (PLEG): container finished" podID="ce610b8b-bbda-4278-8ca1-36ddf712814b" containerID="83091eb1f04569a5c4856ecb06c473510ce9f9208fdffaa4f5decd9d40303138" exitCode=2 May 06 17:13:58.595941 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:58.595540 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5696456b54-gvcqb" event={"ID":"ce610b8b-bbda-4278-8ca1-36ddf712814b","Type":"ContainerDied","Data":"83091eb1f04569a5c4856ecb06c473510ce9f9208fdffaa4f5decd9d40303138"} May 06 17:13:58.595941 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:58.595551 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5696456b54-gvcqb" May 06 17:13:58.595941 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:58.595564 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5696456b54-gvcqb" event={"ID":"ce610b8b-bbda-4278-8ca1-36ddf712814b","Type":"ContainerDied","Data":"da9f3b625594125809e6bed31a94d104ead00715c31b2e2538a42e875a0256f0"} May 06 17:13:58.595941 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:58.595604 2578 scope.go:117] "RemoveContainer" containerID="83091eb1f04569a5c4856ecb06c473510ce9f9208fdffaa4f5decd9d40303138" May 06 17:13:58.603618 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:58.603597 2578 scope.go:117] "RemoveContainer" containerID="83091eb1f04569a5c4856ecb06c473510ce9f9208fdffaa4f5decd9d40303138" May 06 17:13:58.603888 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:13:58.603865 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83091eb1f04569a5c4856ecb06c473510ce9f9208fdffaa4f5decd9d40303138\": container with ID starting with 83091eb1f04569a5c4856ecb06c473510ce9f9208fdffaa4f5decd9d40303138 not found: ID does not exist" containerID="83091eb1f04569a5c4856ecb06c473510ce9f9208fdffaa4f5decd9d40303138" May 06 17:13:58.603948 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:58.603895 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83091eb1f04569a5c4856ecb06c473510ce9f9208fdffaa4f5decd9d40303138"} err="failed to get container status \"83091eb1f04569a5c4856ecb06c473510ce9f9208fdffaa4f5decd9d40303138\": rpc error: code = NotFound desc = could not find container \"83091eb1f04569a5c4856ecb06c473510ce9f9208fdffaa4f5decd9d40303138\": container with ID starting with 83091eb1f04569a5c4856ecb06c473510ce9f9208fdffaa4f5decd9d40303138 not found: ID does not exist" May 06 17:13:58.616872 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:58.616849 2578 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5696456b54-gvcqb"] May 06 17:13:58.619089 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:58.619064 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5696456b54-gvcqb"] May 06 17:13:58.883868 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:13:58.883783 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce610b8b-bbda-4278-8ca1-36ddf712814b" path="/var/lib/kubelet/pods/ce610b8b-bbda-4278-8ca1-36ddf712814b/volumes" May 06 17:15:11.670346 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:11.670309 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-fmtz2"] May 06 17:15:11.670819 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:11.670618 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce610b8b-bbda-4278-8ca1-36ddf712814b" containerName="console" May 06 17:15:11.670819 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:11.670629 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce610b8b-bbda-4278-8ca1-36ddf712814b" containerName="console" May 06 17:15:11.670819 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:11.670682 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce610b8b-bbda-4278-8ca1-36ddf712814b" containerName="console" May 06 17:15:11.673402 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:11.673387 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fmtz2" May 06 17:15:11.676837 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:11.676810 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" May 06 17:15:11.677865 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:11.677848 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" May 06 17:15:11.677958 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:11.677864 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-q966n\"" May 06 17:15:11.688567 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:11.688545 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-fmtz2"] May 06 17:15:11.813408 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:11.813376 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z525w\" (UniqueName: \"kubernetes.io/projected/4f1c1b72-2fde-4d8c-940e-ef9ad420264e-kube-api-access-z525w\") pod \"openshift-lws-operator-bfc7f696d-fmtz2\" (UID: \"4f1c1b72-2fde-4d8c-940e-ef9ad420264e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fmtz2" May 06 17:15:11.813567 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:11.813426 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f1c1b72-2fde-4d8c-940e-ef9ad420264e-tmp\") pod \"openshift-lws-operator-bfc7f696d-fmtz2\" (UID: \"4f1c1b72-2fde-4d8c-940e-ef9ad420264e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fmtz2" May 06 17:15:11.913879 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:11.913853 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-z525w\" (UniqueName: \"kubernetes.io/projected/4f1c1b72-2fde-4d8c-940e-ef9ad420264e-kube-api-access-z525w\") pod \"openshift-lws-operator-bfc7f696d-fmtz2\" (UID: \"4f1c1b72-2fde-4d8c-940e-ef9ad420264e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fmtz2" May 06 17:15:11.914013 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:11.913902 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f1c1b72-2fde-4d8c-940e-ef9ad420264e-tmp\") pod \"openshift-lws-operator-bfc7f696d-fmtz2\" (UID: \"4f1c1b72-2fde-4d8c-940e-ef9ad420264e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fmtz2" May 06 17:15:11.914234 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:11.914218 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f1c1b72-2fde-4d8c-940e-ef9ad420264e-tmp\") pod \"openshift-lws-operator-bfc7f696d-fmtz2\" (UID: \"4f1c1b72-2fde-4d8c-940e-ef9ad420264e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fmtz2" May 06 17:15:11.922396 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:11.922374 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z525w\" (UniqueName: \"kubernetes.io/projected/4f1c1b72-2fde-4d8c-940e-ef9ad420264e-kube-api-access-z525w\") pod \"openshift-lws-operator-bfc7f696d-fmtz2\" (UID: \"4f1c1b72-2fde-4d8c-940e-ef9ad420264e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fmtz2" May 06 17:15:11.991176 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:11.991152 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fmtz2" May 06 17:15:12.117350 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:12.113359 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-fmtz2"] May 06 17:15:12.119133 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:15:12.119104 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f1c1b72_2fde_4d8c_940e_ef9ad420264e.slice/crio-2c78545d19d47b29578894b16d673cfce696075f7b64c692ac04128c715f3b6e WatchSource:0}: Error finding container 2c78545d19d47b29578894b16d673cfce696075f7b64c692ac04128c715f3b6e: Status 404 returned error can't find the container with id 2c78545d19d47b29578894b16d673cfce696075f7b64c692ac04128c715f3b6e May 06 17:15:12.801644 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:12.801611 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fmtz2" event={"ID":"4f1c1b72-2fde-4d8c-940e-ef9ad420264e","Type":"ContainerStarted","Data":"2c78545d19d47b29578894b16d673cfce696075f7b64c692ac04128c715f3b6e"} May 06 17:15:15.812324 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:15.812292 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fmtz2" event={"ID":"4f1c1b72-2fde-4d8c-940e-ef9ad420264e","Type":"ContainerStarted","Data":"8e67f4cfc634782fe576f4a9caf1c9bc8f3187b58764941d653dd899dc5fd43a"} May 06 17:15:15.829888 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:15.829841 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fmtz2" podStartSLOduration=1.5633368330000001 podStartE2EDuration="4.829827816s" podCreationTimestamp="2026-05-06 17:15:11 +0000 UTC" firstStartedPulling="2026-05-06 17:15:12.120854905 +0000 UTC 
m=+285.762252484" lastFinishedPulling="2026-05-06 17:15:15.387345876 +0000 UTC m=+289.028743467" observedRunningTime="2026-05-06 17:15:15.828653794 +0000 UTC m=+289.470051391" watchObservedRunningTime="2026-05-06 17:15:15.829827816 +0000 UTC m=+289.471225414" May 06 17:15:18.281714 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:18.281682 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-ttxwp"] May 06 17:15:18.285111 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:18.285096 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-ttxwp" May 06 17:15:18.287875 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:18.287855 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" May 06 17:15:18.287980 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:18.287857 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-ck648\"" May 06 17:15:18.289011 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:18.288998 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" May 06 17:15:18.297500 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:18.297479 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-ttxwp"] May 06 17:15:18.362970 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:18.362944 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7dd46b4-01d1-4b73-9a81-ddc6c3792573-bound-sa-token\") pod \"cert-manager-79c8d999ff-ttxwp\" (UID: \"d7dd46b4-01d1-4b73-9a81-ddc6c3792573\") " pod="cert-manager/cert-manager-79c8d999ff-ttxwp" May 06 17:15:18.363063 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:18.362973 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgq9t\" (UniqueName: \"kubernetes.io/projected/d7dd46b4-01d1-4b73-9a81-ddc6c3792573-kube-api-access-cgq9t\") pod \"cert-manager-79c8d999ff-ttxwp\" (UID: \"d7dd46b4-01d1-4b73-9a81-ddc6c3792573\") " pod="cert-manager/cert-manager-79c8d999ff-ttxwp" May 06 17:15:18.464296 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:18.464268 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7dd46b4-01d1-4b73-9a81-ddc6c3792573-bound-sa-token\") pod \"cert-manager-79c8d999ff-ttxwp\" (UID: \"d7dd46b4-01d1-4b73-9a81-ddc6c3792573\") " pod="cert-manager/cert-manager-79c8d999ff-ttxwp" May 06 17:15:18.464396 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:18.464297 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cgq9t\" (UniqueName: \"kubernetes.io/projected/d7dd46b4-01d1-4b73-9a81-ddc6c3792573-kube-api-access-cgq9t\") pod \"cert-manager-79c8d999ff-ttxwp\" (UID: \"d7dd46b4-01d1-4b73-9a81-ddc6c3792573\") " pod="cert-manager/cert-manager-79c8d999ff-ttxwp" May 06 17:15:18.473398 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:18.473357 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7dd46b4-01d1-4b73-9a81-ddc6c3792573-bound-sa-token\") pod \"cert-manager-79c8d999ff-ttxwp\" (UID: \"d7dd46b4-01d1-4b73-9a81-ddc6c3792573\") " pod="cert-manager/cert-manager-79c8d999ff-ttxwp" May 06 17:15:18.473517 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:18.473413 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgq9t\" (UniqueName: \"kubernetes.io/projected/d7dd46b4-01d1-4b73-9a81-ddc6c3792573-kube-api-access-cgq9t\") pod \"cert-manager-79c8d999ff-ttxwp\" (UID: \"d7dd46b4-01d1-4b73-9a81-ddc6c3792573\") " 
pod="cert-manager/cert-manager-79c8d999ff-ttxwp" May 06 17:15:18.594508 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:18.594419 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-ttxwp" May 06 17:15:18.717826 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:18.717748 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-ttxwp"] May 06 17:15:18.720038 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:15:18.720010 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7dd46b4_01d1_4b73_9a81_ddc6c3792573.slice/crio-52bb54310f92b7ac2cfa1e10ce12e1a06c63b61540c3c97be1ea0cda6647897e WatchSource:0}: Error finding container 52bb54310f92b7ac2cfa1e10ce12e1a06c63b61540c3c97be1ea0cda6647897e: Status 404 returned error can't find the container with id 52bb54310f92b7ac2cfa1e10ce12e1a06c63b61540c3c97be1ea0cda6647897e May 06 17:15:18.821183 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:18.821149 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-ttxwp" event={"ID":"d7dd46b4-01d1-4b73-9a81-ddc6c3792573","Type":"ContainerStarted","Data":"52bb54310f92b7ac2cfa1e10ce12e1a06c63b61540c3c97be1ea0cda6647897e"} May 06 17:15:21.832042 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:21.831992 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-ttxwp" event={"ID":"d7dd46b4-01d1-4b73-9a81-ddc6c3792573","Type":"ContainerStarted","Data":"59ff3cffe91034a16019eadda9afc4d10082f802917304cdfaebde7f0faf1eb5"} May 06 17:15:21.849683 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:21.849634 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-ttxwp" podStartSLOduration=0.867479015 podStartE2EDuration="3.849619457s" podCreationTimestamp="2026-05-06 17:15:18 +0000 
UTC" firstStartedPulling="2026-05-06 17:15:18.721884801 +0000 UTC m=+292.363282377" lastFinishedPulling="2026-05-06 17:15:21.704025229 +0000 UTC m=+295.345422819" observedRunningTime="2026-05-06 17:15:21.848449143 +0000 UTC m=+295.489846766" watchObservedRunningTime="2026-05-06 17:15:21.849619457 +0000 UTC m=+295.491017054" May 06 17:15:24.550151 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:24.550113 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7fd7474bdd-99k2r"] May 06 17:15:24.553472 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:24.553452 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7fd7474bdd-99k2r" May 06 17:15:24.557473 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:24.557451 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" May 06 17:15:24.557600 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:24.557470 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" May 06 17:15:24.557600 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:24.557534 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-b99g6\"" May 06 17:15:24.557600 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:24.557558 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" May 06 17:15:24.561439 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:24.561418 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7fd7474bdd-99k2r"] May 06 17:15:24.617911 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:24.617882 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-w65ww\" (UniqueName: \"kubernetes.io/projected/fc5dde84-ff65-4a2b-8fa2-291004ae61c7-kube-api-access-w65ww\") pod \"lws-controller-manager-7fd7474bdd-99k2r\" (UID: \"fc5dde84-ff65-4a2b-8fa2-291004ae61c7\") " pod="openshift-lws-operator/lws-controller-manager-7fd7474bdd-99k2r" May 06 17:15:24.618035 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:24.617925 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fc5dde84-ff65-4a2b-8fa2-291004ae61c7-manager-config\") pod \"lws-controller-manager-7fd7474bdd-99k2r\" (UID: \"fc5dde84-ff65-4a2b-8fa2-291004ae61c7\") " pod="openshift-lws-operator/lws-controller-manager-7fd7474bdd-99k2r" May 06 17:15:24.618035 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:24.617948 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fc5dde84-ff65-4a2b-8fa2-291004ae61c7-metrics-cert\") pod \"lws-controller-manager-7fd7474bdd-99k2r\" (UID: \"fc5dde84-ff65-4a2b-8fa2-291004ae61c7\") " pod="openshift-lws-operator/lws-controller-manager-7fd7474bdd-99k2r" May 06 17:15:24.618035 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:24.618001 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc5dde84-ff65-4a2b-8fa2-291004ae61c7-cert\") pod \"lws-controller-manager-7fd7474bdd-99k2r\" (UID: \"fc5dde84-ff65-4a2b-8fa2-291004ae61c7\") " pod="openshift-lws-operator/lws-controller-manager-7fd7474bdd-99k2r" May 06 17:15:24.718806 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:24.718766 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fc5dde84-ff65-4a2b-8fa2-291004ae61c7-manager-config\") pod \"lws-controller-manager-7fd7474bdd-99k2r\" (UID: 
\"fc5dde84-ff65-4a2b-8fa2-291004ae61c7\") " pod="openshift-lws-operator/lws-controller-manager-7fd7474bdd-99k2r" May 06 17:15:24.718806 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:24.718814 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fc5dde84-ff65-4a2b-8fa2-291004ae61c7-metrics-cert\") pod \"lws-controller-manager-7fd7474bdd-99k2r\" (UID: \"fc5dde84-ff65-4a2b-8fa2-291004ae61c7\") " pod="openshift-lws-operator/lws-controller-manager-7fd7474bdd-99k2r" May 06 17:15:24.719013 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:24.718838 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc5dde84-ff65-4a2b-8fa2-291004ae61c7-cert\") pod \"lws-controller-manager-7fd7474bdd-99k2r\" (UID: \"fc5dde84-ff65-4a2b-8fa2-291004ae61c7\") " pod="openshift-lws-operator/lws-controller-manager-7fd7474bdd-99k2r" May 06 17:15:24.719013 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:24.718892 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w65ww\" (UniqueName: \"kubernetes.io/projected/fc5dde84-ff65-4a2b-8fa2-291004ae61c7-kube-api-access-w65ww\") pod \"lws-controller-manager-7fd7474bdd-99k2r\" (UID: \"fc5dde84-ff65-4a2b-8fa2-291004ae61c7\") " pod="openshift-lws-operator/lws-controller-manager-7fd7474bdd-99k2r" May 06 17:15:24.719368 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:24.719349 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fc5dde84-ff65-4a2b-8fa2-291004ae61c7-manager-config\") pod \"lws-controller-manager-7fd7474bdd-99k2r\" (UID: \"fc5dde84-ff65-4a2b-8fa2-291004ae61c7\") " pod="openshift-lws-operator/lws-controller-manager-7fd7474bdd-99k2r" May 06 17:15:24.721268 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:24.721244 2578 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fc5dde84-ff65-4a2b-8fa2-291004ae61c7-metrics-cert\") pod \"lws-controller-manager-7fd7474bdd-99k2r\" (UID: \"fc5dde84-ff65-4a2b-8fa2-291004ae61c7\") " pod="openshift-lws-operator/lws-controller-manager-7fd7474bdd-99k2r" May 06 17:15:24.721347 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:24.721312 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc5dde84-ff65-4a2b-8fa2-291004ae61c7-cert\") pod \"lws-controller-manager-7fd7474bdd-99k2r\" (UID: \"fc5dde84-ff65-4a2b-8fa2-291004ae61c7\") " pod="openshift-lws-operator/lws-controller-manager-7fd7474bdd-99k2r" May 06 17:15:24.730199 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:24.730180 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w65ww\" (UniqueName: \"kubernetes.io/projected/fc5dde84-ff65-4a2b-8fa2-291004ae61c7-kube-api-access-w65ww\") pod \"lws-controller-manager-7fd7474bdd-99k2r\" (UID: \"fc5dde84-ff65-4a2b-8fa2-291004ae61c7\") " pod="openshift-lws-operator/lws-controller-manager-7fd7474bdd-99k2r" May 06 17:15:24.862811 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:24.862731 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7fd7474bdd-99k2r" May 06 17:15:24.989891 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:24.989860 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7fd7474bdd-99k2r"] May 06 17:15:24.992233 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:15:24.992203 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc5dde84_ff65_4a2b_8fa2_291004ae61c7.slice/crio-79d735f7035fe2b9831c09fd4ccce8019074c832b2365034dbce426b54e85b71 WatchSource:0}: Error finding container 79d735f7035fe2b9831c09fd4ccce8019074c832b2365034dbce426b54e85b71: Status 404 returned error can't find the container with id 79d735f7035fe2b9831c09fd4ccce8019074c832b2365034dbce426b54e85b71 May 06 17:15:25.849275 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:25.849239 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7fd7474bdd-99k2r" event={"ID":"fc5dde84-ff65-4a2b-8fa2-291004ae61c7","Type":"ContainerStarted","Data":"79d735f7035fe2b9831c09fd4ccce8019074c832b2365034dbce426b54e85b71"} May 06 17:15:26.902776 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:26.902647 2578 kubelet.go:1628] "Image garbage collection succeeded" May 06 17:15:27.856493 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:27.856454 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7fd7474bdd-99k2r" event={"ID":"fc5dde84-ff65-4a2b-8fa2-291004ae61c7","Type":"ContainerStarted","Data":"bf88d2ca112c622bd3acb53fb2eca04a91e61d353c53c57a1488a9980f662873"} May 06 17:15:27.856678 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:27.856557 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7fd7474bdd-99k2r" May 06 17:15:27.874247 ip-10-0-131-115 
kubenswrapper[2578]: I0506 17:15:27.874205 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-7fd7474bdd-99k2r" podStartSLOduration=1.930669578 podStartE2EDuration="3.874194093s" podCreationTimestamp="2026-05-06 17:15:24 +0000 UTC" firstStartedPulling="2026-05-06 17:15:24.993980759 +0000 UTC m=+298.635378335" lastFinishedPulling="2026-05-06 17:15:26.937505261 +0000 UTC m=+300.578902850" observedRunningTime="2026-05-06 17:15:27.874094771 +0000 UTC m=+301.515492369" watchObservedRunningTime="2026-05-06 17:15:27.874194093 +0000 UTC m=+301.515591690" May 06 17:15:35.448623 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:35.448534 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-698574c4f-98m5v"] May 06 17:15:35.452104 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:35.452083 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-98m5v" May 06 17:15:35.454747 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:35.454724 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" May 06 17:15:35.454865 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:35.454738 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" May 06 17:15:35.454865 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:35.454732 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-spdkz\"" May 06 17:15:35.454865 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:35.454786 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" May 06 17:15:35.454865 ip-10-0-131-115 kubenswrapper[2578]: 
I0506 17:15:35.454798 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" May 06 17:15:35.461093 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:35.461072 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-698574c4f-98m5v"] May 06 17:15:35.500396 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:35.500365 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z26kd\" (UniqueName: \"kubernetes.io/projected/04562d63-745b-45e8-9842-354b822ed447-kube-api-access-z26kd\") pod \"opendatahub-operator-controller-manager-698574c4f-98m5v\" (UID: \"04562d63-745b-45e8-9842-354b822ed447\") " pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-98m5v" May 06 17:15:35.500526 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:35.500422 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/04562d63-745b-45e8-9842-354b822ed447-webhook-cert\") pod \"opendatahub-operator-controller-manager-698574c4f-98m5v\" (UID: \"04562d63-745b-45e8-9842-354b822ed447\") " pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-98m5v" May 06 17:15:35.500526 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:35.500498 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/04562d63-745b-45e8-9842-354b822ed447-apiservice-cert\") pod \"opendatahub-operator-controller-manager-698574c4f-98m5v\" (UID: \"04562d63-745b-45e8-9842-354b822ed447\") " pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-98m5v" May 06 17:15:35.601659 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:35.601633 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/04562d63-745b-45e8-9842-354b822ed447-apiservice-cert\") pod \"opendatahub-operator-controller-manager-698574c4f-98m5v\" (UID: \"04562d63-745b-45e8-9842-354b822ed447\") " pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-98m5v" May 06 17:15:35.601766 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:35.601684 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z26kd\" (UniqueName: \"kubernetes.io/projected/04562d63-745b-45e8-9842-354b822ed447-kube-api-access-z26kd\") pod \"opendatahub-operator-controller-manager-698574c4f-98m5v\" (UID: \"04562d63-745b-45e8-9842-354b822ed447\") " pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-98m5v" May 06 17:15:35.601766 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:35.601718 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/04562d63-745b-45e8-9842-354b822ed447-webhook-cert\") pod \"opendatahub-operator-controller-manager-698574c4f-98m5v\" (UID: \"04562d63-745b-45e8-9842-354b822ed447\") " pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-98m5v" May 06 17:15:35.604052 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:35.604024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/04562d63-745b-45e8-9842-354b822ed447-apiservice-cert\") pod \"opendatahub-operator-controller-manager-698574c4f-98m5v\" (UID: \"04562d63-745b-45e8-9842-354b822ed447\") " pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-98m5v" May 06 17:15:35.604052 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:35.604049 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/04562d63-745b-45e8-9842-354b822ed447-webhook-cert\") pod 
\"opendatahub-operator-controller-manager-698574c4f-98m5v\" (UID: \"04562d63-745b-45e8-9842-354b822ed447\") " pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-98m5v" May 06 17:15:35.623373 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:35.623351 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z26kd\" (UniqueName: \"kubernetes.io/projected/04562d63-745b-45e8-9842-354b822ed447-kube-api-access-z26kd\") pod \"opendatahub-operator-controller-manager-698574c4f-98m5v\" (UID: \"04562d63-745b-45e8-9842-354b822ed447\") " pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-98m5v" May 06 17:15:35.763802 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:35.763730 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-98m5v" May 06 17:15:35.914053 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:35.914031 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-698574c4f-98m5v"] May 06 17:15:35.916550 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:15:35.916525 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04562d63_745b_45e8_9842_354b822ed447.slice/crio-3d86e9de25e02da18fd2a65059b97a79dc43e56701ffe761844fb7ab2f6bd9ef WatchSource:0}: Error finding container 3d86e9de25e02da18fd2a65059b97a79dc43e56701ffe761844fb7ab2f6bd9ef: Status 404 returned error can't find the container with id 3d86e9de25e02da18fd2a65059b97a79dc43e56701ffe761844fb7ab2f6bd9ef May 06 17:15:35.918363 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:35.918345 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider May 06 17:15:36.885365 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:36.885308 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-98m5v" event={"ID":"04562d63-745b-45e8-9842-354b822ed447","Type":"ContainerStarted","Data":"3d86e9de25e02da18fd2a65059b97a79dc43e56701ffe761844fb7ab2f6bd9ef"} May 06 17:15:38.861515 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:38.861440 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7fd7474bdd-99k2r" May 06 17:15:38.891689 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:38.891656 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-98m5v" event={"ID":"04562d63-745b-45e8-9842-354b822ed447","Type":"ContainerStarted","Data":"e0dc3372f259fa7be2b052ae6f8b9f891ba6b55fb97f85e13be17e0ea9e2672f"} May 06 17:15:38.891853 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:38.891803 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-98m5v" May 06 17:15:38.913192 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:38.913142 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-98m5v" podStartSLOduration=1.273317026 podStartE2EDuration="3.913123531s" podCreationTimestamp="2026-05-06 17:15:35 +0000 UTC" firstStartedPulling="2026-05-06 17:15:35.918516188 +0000 UTC m=+309.559913764" lastFinishedPulling="2026-05-06 17:15:38.558322686 +0000 UTC m=+312.199720269" observedRunningTime="2026-05-06 17:15:38.912110405 +0000 UTC m=+312.553508005" watchObservedRunningTime="2026-05-06 17:15:38.913123531 +0000 UTC m=+312.554521114" May 06 17:15:49.898317 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:49.898288 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-98m5v" May 06 17:15:56.327534 ip-10-0-131-115 
kubenswrapper[2578]: I0506 17:15:56.327501 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-69f8cf9d8c-4vzkv"] May 06 17:15:56.336803 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:56.336781 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-4vzkv" May 06 17:15:56.339651 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:56.339624 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" May 06 17:15:56.340138 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:56.340116 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-69f8cf9d8c-4vzkv"] May 06 17:15:56.341038 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:56.341015 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-6grhb\"" May 06 17:15:56.341038 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:56.341031 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" May 06 17:15:56.341185 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:56.341015 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" May 06 17:15:56.341185 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:56.341038 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" May 06 17:15:56.482452 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:56.482418 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f53b9e5-5c24-4c3d-a6f0-6c1a66e9b1d5-tls-certs\") pod \"kube-auth-proxy-69f8cf9d8c-4vzkv\" (UID: \"9f53b9e5-5c24-4c3d-a6f0-6c1a66e9b1d5\") " 
pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-4vzkv" May 06 17:15:56.482452 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:56.482454 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb8hq\" (UniqueName: \"kubernetes.io/projected/9f53b9e5-5c24-4c3d-a6f0-6c1a66e9b1d5-kube-api-access-sb8hq\") pod \"kube-auth-proxy-69f8cf9d8c-4vzkv\" (UID: \"9f53b9e5-5c24-4c3d-a6f0-6c1a66e9b1d5\") " pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-4vzkv" May 06 17:15:56.482687 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:56.482478 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9f53b9e5-5c24-4c3d-a6f0-6c1a66e9b1d5-tmp\") pod \"kube-auth-proxy-69f8cf9d8c-4vzkv\" (UID: \"9f53b9e5-5c24-4c3d-a6f0-6c1a66e9b1d5\") " pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-4vzkv" May 06 17:15:56.583496 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:56.583415 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f53b9e5-5c24-4c3d-a6f0-6c1a66e9b1d5-tls-certs\") pod \"kube-auth-proxy-69f8cf9d8c-4vzkv\" (UID: \"9f53b9e5-5c24-4c3d-a6f0-6c1a66e9b1d5\") " pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-4vzkv" May 06 17:15:56.583496 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:56.583452 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sb8hq\" (UniqueName: \"kubernetes.io/projected/9f53b9e5-5c24-4c3d-a6f0-6c1a66e9b1d5-kube-api-access-sb8hq\") pod \"kube-auth-proxy-69f8cf9d8c-4vzkv\" (UID: \"9f53b9e5-5c24-4c3d-a6f0-6c1a66e9b1d5\") " pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-4vzkv" May 06 17:15:56.583496 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:56.583478 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/9f53b9e5-5c24-4c3d-a6f0-6c1a66e9b1d5-tmp\") pod \"kube-auth-proxy-69f8cf9d8c-4vzkv\" (UID: \"9f53b9e5-5c24-4c3d-a6f0-6c1a66e9b1d5\") " pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-4vzkv" May 06 17:15:56.585762 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:56.585738 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9f53b9e5-5c24-4c3d-a6f0-6c1a66e9b1d5-tmp\") pod \"kube-auth-proxy-69f8cf9d8c-4vzkv\" (UID: \"9f53b9e5-5c24-4c3d-a6f0-6c1a66e9b1d5\") " pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-4vzkv" May 06 17:15:56.586095 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:56.586064 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f53b9e5-5c24-4c3d-a6f0-6c1a66e9b1d5-tls-certs\") pod \"kube-auth-proxy-69f8cf9d8c-4vzkv\" (UID: \"9f53b9e5-5c24-4c3d-a6f0-6c1a66e9b1d5\") " pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-4vzkv" May 06 17:15:56.595004 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:56.594982 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb8hq\" (UniqueName: \"kubernetes.io/projected/9f53b9e5-5c24-4c3d-a6f0-6c1a66e9b1d5-kube-api-access-sb8hq\") pod \"kube-auth-proxy-69f8cf9d8c-4vzkv\" (UID: \"9f53b9e5-5c24-4c3d-a6f0-6c1a66e9b1d5\") " pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-4vzkv" May 06 17:15:56.649200 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:56.649169 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-4vzkv" May 06 17:15:56.792478 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:56.792448 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-69f8cf9d8c-4vzkv"] May 06 17:15:56.795527 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:15:56.795497 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f53b9e5_5c24_4c3d_a6f0_6c1a66e9b1d5.slice/crio-585f80b46584caf132b0ff9eaa14794c91b9e4be7c5601258c417afcb185e0b3 WatchSource:0}: Error finding container 585f80b46584caf132b0ff9eaa14794c91b9e4be7c5601258c417afcb185e0b3: Status 404 returned error can't find the container with id 585f80b46584caf132b0ff9eaa14794c91b9e4be7c5601258c417afcb185e0b3 May 06 17:15:56.952957 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:15:56.952920 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-4vzkv" event={"ID":"9f53b9e5-5c24-4c3d-a6f0-6c1a66e9b1d5","Type":"ContainerStarted","Data":"585f80b46584caf132b0ff9eaa14794c91b9e4be7c5601258c417afcb185e0b3"} May 06 17:16:00.969348 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:16:00.969314 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-4vzkv" event={"ID":"9f53b9e5-5c24-4c3d-a6f0-6c1a66e9b1d5","Type":"ContainerStarted","Data":"e4d40aae27cc118e206377ebfaff74767f274335553553f109bc382f41fc82bb"} May 06 17:16:00.995438 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:16:00.995383 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-4vzkv" podStartSLOduration=1.5915654510000001 podStartE2EDuration="4.99536861s" podCreationTimestamp="2026-05-06 17:15:56 +0000 UTC" firstStartedPulling="2026-05-06 17:15:56.797172605 +0000 UTC m=+330.438570186" lastFinishedPulling="2026-05-06 17:16:00.200975765 
+0000 UTC m=+333.842373345" observedRunningTime="2026-05-06 17:16:00.993382018 +0000 UTC m=+334.634779619" watchObservedRunningTime="2026-05-06 17:16:00.99536861 +0000 UTC m=+334.636766208" May 06 17:17:15.497047 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.497016 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c64c884c-pbc48"] May 06 17:17:15.499330 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.499314 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c64c884c-pbc48" May 06 17:17:15.520089 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.520064 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c64c884c-pbc48"] May 06 17:17:15.543317 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.543296 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18c36a0e-cbf5-45a5-bb07-25bd57a01f2d-console-config\") pod \"console-6c64c884c-pbc48\" (UID: \"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d\") " pod="openshift-console/console-6c64c884c-pbc48" May 06 17:17:15.543416 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.543331 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18c36a0e-cbf5-45a5-bb07-25bd57a01f2d-trusted-ca-bundle\") pod \"console-6c64c884c-pbc48\" (UID: \"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d\") " pod="openshift-console/console-6c64c884c-pbc48" May 06 17:17:15.543416 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.543359 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjsng\" (UniqueName: \"kubernetes.io/projected/18c36a0e-cbf5-45a5-bb07-25bd57a01f2d-kube-api-access-wjsng\") pod \"console-6c64c884c-pbc48\" (UID: 
\"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d\") " pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:15.543416 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.543402 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18c36a0e-cbf5-45a5-bb07-25bd57a01f2d-console-oauth-config\") pod \"console-6c64c884c-pbc48\" (UID: \"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d\") " pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:15.543543 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.543422 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18c36a0e-cbf5-45a5-bb07-25bd57a01f2d-service-ca\") pod \"console-6c64c884c-pbc48\" (UID: \"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d\") " pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:15.543543 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.543461 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18c36a0e-cbf5-45a5-bb07-25bd57a01f2d-oauth-serving-cert\") pod \"console-6c64c884c-pbc48\" (UID: \"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d\") " pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:15.543543 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.543532 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18c36a0e-cbf5-45a5-bb07-25bd57a01f2d-console-serving-cert\") pod \"console-6c64c884c-pbc48\" (UID: \"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d\") " pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:15.644595 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.644565 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wjsng\" (UniqueName: \"kubernetes.io/projected/18c36a0e-cbf5-45a5-bb07-25bd57a01f2d-kube-api-access-wjsng\") pod \"console-6c64c884c-pbc48\" (UID: \"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d\") " pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:15.644703 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.644624 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18c36a0e-cbf5-45a5-bb07-25bd57a01f2d-console-oauth-config\") pod \"console-6c64c884c-pbc48\" (UID: \"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d\") " pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:15.644703 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.644645 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18c36a0e-cbf5-45a5-bb07-25bd57a01f2d-service-ca\") pod \"console-6c64c884c-pbc48\" (UID: \"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d\") " pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:15.644703 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.644660 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18c36a0e-cbf5-45a5-bb07-25bd57a01f2d-oauth-serving-cert\") pod \"console-6c64c884c-pbc48\" (UID: \"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d\") " pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:15.644703 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.644697 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18c36a0e-cbf5-45a5-bb07-25bd57a01f2d-console-serving-cert\") pod \"console-6c64c884c-pbc48\" (UID: \"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d\") " pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:15.644873 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.644850 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18c36a0e-cbf5-45a5-bb07-25bd57a01f2d-console-config\") pod \"console-6c64c884c-pbc48\" (UID: \"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d\") " pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:15.644916 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.644907 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18c36a0e-cbf5-45a5-bb07-25bd57a01f2d-trusted-ca-bundle\") pod \"console-6c64c884c-pbc48\" (UID: \"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d\") " pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:15.645403 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.645367 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18c36a0e-cbf5-45a5-bb07-25bd57a01f2d-service-ca\") pod \"console-6c64c884c-pbc48\" (UID: \"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d\") " pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:15.645499 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.645446 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18c36a0e-cbf5-45a5-bb07-25bd57a01f2d-oauth-serving-cert\") pod \"console-6c64c884c-pbc48\" (UID: \"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d\") " pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:15.645499 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.645470 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18c36a0e-cbf5-45a5-bb07-25bd57a01f2d-console-config\") pod \"console-6c64c884c-pbc48\" (UID: \"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d\") " pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:15.645672 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.645656 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18c36a0e-cbf5-45a5-bb07-25bd57a01f2d-trusted-ca-bundle\") pod \"console-6c64c884c-pbc48\" (UID: \"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d\") " pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:15.647150 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.647133 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18c36a0e-cbf5-45a5-bb07-25bd57a01f2d-console-serving-cert\") pod \"console-6c64c884c-pbc48\" (UID: \"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d\") " pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:15.647273 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.647256 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18c36a0e-cbf5-45a5-bb07-25bd57a01f2d-console-oauth-config\") pod \"console-6c64c884c-pbc48\" (UID: \"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d\") " pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:15.654293 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.654277 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjsng\" (UniqueName: \"kubernetes.io/projected/18c36a0e-cbf5-45a5-bb07-25bd57a01f2d-kube-api-access-wjsng\") pod \"console-6c64c884c-pbc48\" (UID: \"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d\") " pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:15.808498 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.808416 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:15.955128 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:15.955103 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c64c884c-pbc48"]
May 06 17:17:15.956632 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:17:15.956601 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18c36a0e_cbf5_45a5_bb07_25bd57a01f2d.slice/crio-2a53643d06e514a352aec7c135626d0dabdda95b95048e540f5bb484018d41f6 WatchSource:0}: Error finding container 2a53643d06e514a352aec7c135626d0dabdda95b95048e540f5bb484018d41f6: Status 404 returned error can't find the container with id 2a53643d06e514a352aec7c135626d0dabdda95b95048e540f5bb484018d41f6
May 06 17:17:16.216311 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:16.216273 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c64c884c-pbc48" event={"ID":"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d","Type":"ContainerStarted","Data":"7c22ef38fb25612a72de8b520b166f15c2268dc04f478944b0f048df0022f964"}
May 06 17:17:16.216478 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:16.216317 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c64c884c-pbc48" event={"ID":"18c36a0e-cbf5-45a5-bb07-25bd57a01f2d","Type":"ContainerStarted","Data":"2a53643d06e514a352aec7c135626d0dabdda95b95048e540f5bb484018d41f6"}
May 06 17:17:16.244015 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:16.243958 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c64c884c-pbc48" podStartSLOduration=1.243940223 podStartE2EDuration="1.243940223s" podCreationTimestamp="2026-05-06 17:17:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-06 17:17:16.241980776 +0000 UTC m=+409.883378374" watchObservedRunningTime="2026-05-06 17:17:16.243940223 +0000 UTC m=+409.885337822"
May 06 17:17:25.808850 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:25.808807 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:25.809244 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:25.808862 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:25.813325 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:25.813302 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:26.252370 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:26.252341 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c64c884c-pbc48"
May 06 17:17:26.328499 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:26.328467 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d47b68768-6v9ml"]
May 06 17:17:50.978287 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:50.978197 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6q9bw"]
May 06 17:17:50.980910 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:50.980895 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6q9bw"
May 06 17:17:50.985552 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:50.985530 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
May 06 17:17:50.994370 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:50.994354 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
May 06 17:17:50.998151 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:50.998136 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-tx98p\""
May 06 17:17:51.050867 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.050833 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q4g4\" (UniqueName: \"kubernetes.io/projected/4f8a3fae-4bf9-44f4-b3a2-a33bf87984f6-kube-api-access-5q4g4\") pod \"kuadrant-operator-controller-manager-55c7f4c975-6q9bw\" (UID: \"4f8a3fae-4bf9-44f4-b3a2-a33bf87984f6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6q9bw"
May 06 17:17:51.051021 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.050905 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4f8a3fae-4bf9-44f4-b3a2-a33bf87984f6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-6q9bw\" (UID: \"4f8a3fae-4bf9-44f4-b3a2-a33bf87984f6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6q9bw"
May 06 17:17:51.152125 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.152084 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5q4g4\" (UniqueName: \"kubernetes.io/projected/4f8a3fae-4bf9-44f4-b3a2-a33bf87984f6-kube-api-access-5q4g4\") pod \"kuadrant-operator-controller-manager-55c7f4c975-6q9bw\" (UID: \"4f8a3fae-4bf9-44f4-b3a2-a33bf87984f6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6q9bw"
May 06 17:17:51.152304 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.152183 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4f8a3fae-4bf9-44f4-b3a2-a33bf87984f6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-6q9bw\" (UID: \"4f8a3fae-4bf9-44f4-b3a2-a33bf87984f6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6q9bw"
May 06 17:17:51.152554 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.152534 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4f8a3fae-4bf9-44f4-b3a2-a33bf87984f6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-6q9bw\" (UID: \"4f8a3fae-4bf9-44f4-b3a2-a33bf87984f6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6q9bw"
May 06 17:17:51.157530 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.157503 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6q9bw"]
May 06 17:17:51.174095 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.174060 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q4g4\" (UniqueName: \"kubernetes.io/projected/4f8a3fae-4bf9-44f4-b3a2-a33bf87984f6-kube-api-access-5q4g4\") pod \"kuadrant-operator-controller-manager-55c7f4c975-6q9bw\" (UID: \"4f8a3fae-4bf9-44f4-b3a2-a33bf87984f6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6q9bw"
May 06 17:17:51.290987 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.290906 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6q9bw"
May 06 17:17:51.348182 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.348128 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5d47b68768-6v9ml" podUID="e84f004d-21a8-4850-8cbf-e27ddaa4ec19" containerName="console" containerID="cri-o://2c9ae22564c2adb3afb7b279cb2cf05c6fb5d8ddf082301b9cad39e5a2607eb7" gracePeriod=15
May 06 17:17:51.435232 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.435199 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6q9bw"]
May 06 17:17:51.438309 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:17:51.438280 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f8a3fae_4bf9_44f4_b3a2_a33bf87984f6.slice/crio-9cf7e1a58a08f4d963ffca774ced7db51ef3969232d8ed53a92f244d716d9e90 WatchSource:0}: Error finding container 9cf7e1a58a08f4d963ffca774ced7db51ef3969232d8ed53a92f244d716d9e90: Status 404 returned error can't find the container with id 9cf7e1a58a08f4d963ffca774ced7db51ef3969232d8ed53a92f244d716d9e90
May 06 17:17:51.574576 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.574556 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d47b68768-6v9ml_e84f004d-21a8-4850-8cbf-e27ddaa4ec19/console/0.log"
May 06 17:17:51.574706 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.574629 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d47b68768-6v9ml"
May 06 17:17:51.655349 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.655314 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-service-ca\") pod \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") "
May 06 17:17:51.655514 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.655357 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-oauth-serving-cert\") pod \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") "
May 06 17:17:51.655514 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.655386 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-console-oauth-config\") pod \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") "
May 06 17:17:51.655514 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.655403 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-trusted-ca-bundle\") pod \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") "
May 06 17:17:51.655701 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.655573 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wsbt\" (UniqueName: \"kubernetes.io/projected/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-kube-api-access-4wsbt\") pod \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") "
May 06 17:17:51.655701 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.655641 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-console-config\") pod \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") "
May 06 17:17:51.655701 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.655686 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-console-serving-cert\") pod \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\" (UID: \"e84f004d-21a8-4850-8cbf-e27ddaa4ec19\") "
May 06 17:17:51.655844 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.655768 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-service-ca" (OuterVolumeSpecName: "service-ca") pod "e84f004d-21a8-4850-8cbf-e27ddaa4ec19" (UID: "e84f004d-21a8-4850-8cbf-e27ddaa4ec19"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
May 06 17:17:51.655844 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.655824 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e84f004d-21a8-4850-8cbf-e27ddaa4ec19" (UID: "e84f004d-21a8-4850-8cbf-e27ddaa4ec19"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
May 06 17:17:51.655944 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.655916 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e84f004d-21a8-4850-8cbf-e27ddaa4ec19" (UID: "e84f004d-21a8-4850-8cbf-e27ddaa4ec19"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
May 06 17:17:51.656063 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.656036 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-console-config" (OuterVolumeSpecName: "console-config") pod "e84f004d-21a8-4850-8cbf-e27ddaa4ec19" (UID: "e84f004d-21a8-4850-8cbf-e27ddaa4ec19"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
May 06 17:17:51.656178 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.656113 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-console-config\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:17:51.656178 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.656132 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-service-ca\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:17:51.656178 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.656143 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-oauth-serving-cert\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:17:51.656178 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.656152 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-trusted-ca-bundle\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:17:51.657647 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.657623 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e84f004d-21a8-4850-8cbf-e27ddaa4ec19" (UID: "e84f004d-21a8-4850-8cbf-e27ddaa4ec19"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
May 06 17:17:51.657704 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.657643 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-kube-api-access-4wsbt" (OuterVolumeSpecName: "kube-api-access-4wsbt") pod "e84f004d-21a8-4850-8cbf-e27ddaa4ec19" (UID: "e84f004d-21a8-4850-8cbf-e27ddaa4ec19"). InnerVolumeSpecName "kube-api-access-4wsbt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
May 06 17:17:51.657776 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.657756 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e84f004d-21a8-4850-8cbf-e27ddaa4ec19" (UID: "e84f004d-21a8-4850-8cbf-e27ddaa4ec19"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
May 06 17:17:51.757331 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.757299 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-console-oauth-config\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:17:51.757331 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.757325 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4wsbt\" (UniqueName: \"kubernetes.io/projected/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-kube-api-access-4wsbt\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:17:51.757331 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:51.757335 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e84f004d-21a8-4850-8cbf-e27ddaa4ec19-console-serving-cert\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:17:52.333812 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:52.333777 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6q9bw" event={"ID":"4f8a3fae-4bf9-44f4-b3a2-a33bf87984f6","Type":"ContainerStarted","Data":"9cf7e1a58a08f4d963ffca774ced7db51ef3969232d8ed53a92f244d716d9e90"}
May 06 17:17:52.335144 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:52.335121 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d47b68768-6v9ml_e84f004d-21a8-4850-8cbf-e27ddaa4ec19/console/0.log"
May 06 17:17:52.335242 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:52.335169 2578 generic.go:358] "Generic (PLEG): container finished" podID="e84f004d-21a8-4850-8cbf-e27ddaa4ec19" containerID="2c9ae22564c2adb3afb7b279cb2cf05c6fb5d8ddf082301b9cad39e5a2607eb7" exitCode=2
May 06 17:17:52.335242 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:52.335201 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d47b68768-6v9ml" event={"ID":"e84f004d-21a8-4850-8cbf-e27ddaa4ec19","Type":"ContainerDied","Data":"2c9ae22564c2adb3afb7b279cb2cf05c6fb5d8ddf082301b9cad39e5a2607eb7"}
May 06 17:17:52.335242 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:52.335224 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d47b68768-6v9ml" event={"ID":"e84f004d-21a8-4850-8cbf-e27ddaa4ec19","Type":"ContainerDied","Data":"14d6b0afa7f6152c34ea055130e2817fbff2df8064e0b15d0d7ebc9f66259f90"}
May 06 17:17:52.335242 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:52.335232 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d47b68768-6v9ml"
May 06 17:17:52.335242 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:52.335241 2578 scope.go:117] "RemoveContainer" containerID="2c9ae22564c2adb3afb7b279cb2cf05c6fb5d8ddf082301b9cad39e5a2607eb7"
May 06 17:17:52.358150 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:52.358123 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d47b68768-6v9ml"]
May 06 17:17:52.363976 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:52.363952 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5d47b68768-6v9ml"]
May 06 17:17:52.390737 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:52.390712 2578 scope.go:117] "RemoveContainer" containerID="2c9ae22564c2adb3afb7b279cb2cf05c6fb5d8ddf082301b9cad39e5a2607eb7"
May 06 17:17:52.391068 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:17:52.391043 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c9ae22564c2adb3afb7b279cb2cf05c6fb5d8ddf082301b9cad39e5a2607eb7\": container with ID starting with 2c9ae22564c2adb3afb7b279cb2cf05c6fb5d8ddf082301b9cad39e5a2607eb7 not found: ID does not exist" containerID="2c9ae22564c2adb3afb7b279cb2cf05c6fb5d8ddf082301b9cad39e5a2607eb7"
May 06 17:17:52.391149 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:52.391078 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c9ae22564c2adb3afb7b279cb2cf05c6fb5d8ddf082301b9cad39e5a2607eb7"} err="failed to get container status \"2c9ae22564c2adb3afb7b279cb2cf05c6fb5d8ddf082301b9cad39e5a2607eb7\": rpc error: code = NotFound desc = could not find container \"2c9ae22564c2adb3afb7b279cb2cf05c6fb5d8ddf082301b9cad39e5a2607eb7\": container with ID starting with 2c9ae22564c2adb3afb7b279cb2cf05c6fb5d8ddf082301b9cad39e5a2607eb7 not found: ID does not exist"
May 06 17:17:52.885127 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:52.885096 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e84f004d-21a8-4850-8cbf-e27ddaa4ec19" path="/var/lib/kubelet/pods/e84f004d-21a8-4850-8cbf-e27ddaa4ec19/volumes"
May 06 17:17:56.354327 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:56.354292 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6q9bw" event={"ID":"4f8a3fae-4bf9-44f4-b3a2-a33bf87984f6","Type":"ContainerStarted","Data":"e1920a1aa625e94b0afd9be45157fc6d0f10af5a1b65aaf5e1cab55b98814553"}
May 06 17:17:56.354718 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:56.354439 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6q9bw"
May 06 17:17:56.378191 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:17:56.378133 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6q9bw" podStartSLOduration=2.260982915 podStartE2EDuration="6.378118768s" podCreationTimestamp="2026-05-06 17:17:50 +0000 UTC" firstStartedPulling="2026-05-06 17:17:51.454102208 +0000 UTC m=+445.095499798" lastFinishedPulling="2026-05-06 17:17:55.571238059 +0000 UTC m=+449.212635651" observedRunningTime="2026-05-06 17:17:56.376892469 +0000 UTC m=+450.018290311" watchObservedRunningTime="2026-05-06 17:17:56.378118768 +0000 UTC m=+450.019516367"
May 06 17:18:07.360244 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:07.360210 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6q9bw"
May 06 17:18:29.746097 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:29.746050 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-768b6"]
May 06 17:18:29.746573 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:29.746552 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e84f004d-21a8-4850-8cbf-e27ddaa4ec19" containerName="console"
May 06 17:18:29.746573 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:29.746571 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e84f004d-21a8-4850-8cbf-e27ddaa4ec19" containerName="console"
May 06 17:18:29.746731 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:29.746717 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e84f004d-21a8-4850-8cbf-e27ddaa4ec19" containerName="console"
May 06 17:18:29.750769 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:29.750747 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-768b6"
May 06 17:18:29.753508 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:29.753481 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-k56m8\""
May 06 17:18:29.766976 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:29.766951 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-768b6"]
May 06 17:18:29.803688 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:29.803662 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm2gg\" (UniqueName: \"kubernetes.io/projected/c1de9597-172f-4ab0-84a4-306431723da0-kube-api-access-nm2gg\") pod \"authorino-f99f4b5cd-768b6\" (UID: \"c1de9597-172f-4ab0-84a4-306431723da0\") " pod="kuadrant-system/authorino-f99f4b5cd-768b6"
May 06 17:18:29.904489 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:29.904453 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nm2gg\" (UniqueName: \"kubernetes.io/projected/c1de9597-172f-4ab0-84a4-306431723da0-kube-api-access-nm2gg\") pod \"authorino-f99f4b5cd-768b6\" (UID: \"c1de9597-172f-4ab0-84a4-306431723da0\") " pod="kuadrant-system/authorino-f99f4b5cd-768b6"
May 06 17:18:29.917242 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:29.917212 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm2gg\" (UniqueName: \"kubernetes.io/projected/c1de9597-172f-4ab0-84a4-306431723da0-kube-api-access-nm2gg\") pod \"authorino-f99f4b5cd-768b6\" (UID: \"c1de9597-172f-4ab0-84a4-306431723da0\") " pod="kuadrant-system/authorino-f99f4b5cd-768b6"
May 06 17:18:30.059468 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:30.059381 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-768b6"
May 06 17:18:30.201909 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:30.201881 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-768b6"]
May 06 17:18:30.204076 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:18:30.204047 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1de9597_172f_4ab0_84a4_306431723da0.slice/crio-91b19af289e7a0733ed2c7ba329801695b02794a4d675084a497bf5508240fb8 WatchSource:0}: Error finding container 91b19af289e7a0733ed2c7ba329801695b02794a4d675084a497bf5508240fb8: Status 404 returned error can't find the container with id 91b19af289e7a0733ed2c7ba329801695b02794a4d675084a497bf5508240fb8
May 06 17:18:30.467891 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:30.467853 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-768b6" event={"ID":"c1de9597-172f-4ab0-84a4-306431723da0","Type":"ContainerStarted","Data":"91b19af289e7a0733ed2c7ba329801695b02794a4d675084a497bf5508240fb8"}
May 06 17:18:33.479555 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:33.479520 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-768b6" event={"ID":"c1de9597-172f-4ab0-84a4-306431723da0","Type":"ContainerStarted","Data":"91013454bf83482bde73108ea5473fb727f308fd719628f0710f7dfc5b34a4e5"}
May 06 17:18:33.503254 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:33.503159 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-768b6" podStartSLOduration=1.473921351 podStartE2EDuration="4.503144289s" podCreationTimestamp="2026-05-06 17:18:29 +0000 UTC" firstStartedPulling="2026-05-06 17:18:30.205316591 +0000 UTC m=+483.846714167" lastFinishedPulling="2026-05-06 17:18:33.234539529 +0000 UTC m=+486.875937105" observedRunningTime="2026-05-06 17:18:33.5003774 +0000 UTC m=+487.141774998" watchObservedRunningTime="2026-05-06 17:18:33.503144289 +0000 UTC m=+487.144541897"
May 06 17:18:35.916368 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:35.916333 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-768b6"]
May 06 17:18:35.916897 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:35.916513 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-768b6" podUID="c1de9597-172f-4ab0-84a4-306431723da0" containerName="authorino" containerID="cri-o://91013454bf83482bde73108ea5473fb727f308fd719628f0710f7dfc5b34a4e5" gracePeriod=30
May 06 17:18:36.152510 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:36.152489 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-768b6"
May 06 17:18:36.263786 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:36.263754 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm2gg\" (UniqueName: \"kubernetes.io/projected/c1de9597-172f-4ab0-84a4-306431723da0-kube-api-access-nm2gg\") pod \"c1de9597-172f-4ab0-84a4-306431723da0\" (UID: \"c1de9597-172f-4ab0-84a4-306431723da0\") "
May 06 17:18:36.265840 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:36.265812 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1de9597-172f-4ab0-84a4-306431723da0-kube-api-access-nm2gg" (OuterVolumeSpecName: "kube-api-access-nm2gg") pod "c1de9597-172f-4ab0-84a4-306431723da0" (UID: "c1de9597-172f-4ab0-84a4-306431723da0"). InnerVolumeSpecName "kube-api-access-nm2gg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
May 06 17:18:36.364628 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:36.364603 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nm2gg\" (UniqueName: \"kubernetes.io/projected/c1de9597-172f-4ab0-84a4-306431723da0-kube-api-access-nm2gg\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\""
May 06 17:18:36.494487 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:36.494447 2578 generic.go:358] "Generic (PLEG): container finished" podID="c1de9597-172f-4ab0-84a4-306431723da0" containerID="91013454bf83482bde73108ea5473fb727f308fd719628f0710f7dfc5b34a4e5" exitCode=0
May 06 17:18:36.494679 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:36.494509 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-768b6"
May 06 17:18:36.494679 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:36.494554 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-768b6" event={"ID":"c1de9597-172f-4ab0-84a4-306431723da0","Type":"ContainerDied","Data":"91013454bf83482bde73108ea5473fb727f308fd719628f0710f7dfc5b34a4e5"}
May 06 17:18:36.494679 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:36.494603 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-768b6" event={"ID":"c1de9597-172f-4ab0-84a4-306431723da0","Type":"ContainerDied","Data":"91b19af289e7a0733ed2c7ba329801695b02794a4d675084a497bf5508240fb8"}
May 06 17:18:36.494679 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:36.494622 2578 scope.go:117] "RemoveContainer" containerID="91013454bf83482bde73108ea5473fb727f308fd719628f0710f7dfc5b34a4e5"
May 06 17:18:36.502535 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:36.502513 2578 scope.go:117] "RemoveContainer" containerID="91013454bf83482bde73108ea5473fb727f308fd719628f0710f7dfc5b34a4e5"
May 06 17:18:36.502813 ip-10-0-131-115 kubenswrapper[2578]: E0506
17:18:36.502792 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91013454bf83482bde73108ea5473fb727f308fd719628f0710f7dfc5b34a4e5\": container with ID starting with 91013454bf83482bde73108ea5473fb727f308fd719628f0710f7dfc5b34a4e5 not found: ID does not exist" containerID="91013454bf83482bde73108ea5473fb727f308fd719628f0710f7dfc5b34a4e5" May 06 17:18:36.502902 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:36.502819 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91013454bf83482bde73108ea5473fb727f308fd719628f0710f7dfc5b34a4e5"} err="failed to get container status \"91013454bf83482bde73108ea5473fb727f308fd719628f0710f7dfc5b34a4e5\": rpc error: code = NotFound desc = could not find container \"91013454bf83482bde73108ea5473fb727f308fd719628f0710f7dfc5b34a4e5\": container with ID starting with 91013454bf83482bde73108ea5473fb727f308fd719628f0710f7dfc5b34a4e5 not found: ID does not exist" May 06 17:18:36.529495 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:36.529429 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-768b6"] May 06 17:18:36.537304 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:36.537282 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-768b6"] May 06 17:18:36.886231 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:18:36.886155 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1de9597-172f-4ab0-84a4-306431723da0" path="/var/lib/kubelet/pods/c1de9597-172f-4ab0-84a4-306431723da0/volumes" May 06 17:19:03.566483 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:03.566448 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-gwbpd"] May 06 17:19:03.566921 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:03.566796 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="c1de9597-172f-4ab0-84a4-306431723da0" containerName="authorino" May 06 17:19:03.566921 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:03.566807 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1de9597-172f-4ab0-84a4-306431723da0" containerName="authorino" May 06 17:19:03.566921 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:03.566865 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1de9597-172f-4ab0-84a4-306431723da0" containerName="authorino" May 06 17:19:03.619433 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:03.619397 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-gwbpd"] May 06 17:19:03.619619 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:03.619499 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-gwbpd" May 06 17:19:03.622281 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:03.622254 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-k56m8\"" May 06 17:19:03.702766 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:03.702736 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdqtg\" (UniqueName: \"kubernetes.io/projected/3bc9f22e-a152-4bda-ba87-df95fd710c7c-kube-api-access-bdqtg\") pod \"authorino-8b475cf9f-gwbpd\" (UID: \"3bc9f22e-a152-4bda-ba87-df95fd710c7c\") " pod="kuadrant-system/authorino-8b475cf9f-gwbpd" May 06 17:19:03.803229 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:03.803190 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bdqtg\" (UniqueName: \"kubernetes.io/projected/3bc9f22e-a152-4bda-ba87-df95fd710c7c-kube-api-access-bdqtg\") pod \"authorino-8b475cf9f-gwbpd\" (UID: \"3bc9f22e-a152-4bda-ba87-df95fd710c7c\") " pod="kuadrant-system/authorino-8b475cf9f-gwbpd" May 06 
17:19:03.819747 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:03.819684 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdqtg\" (UniqueName: \"kubernetes.io/projected/3bc9f22e-a152-4bda-ba87-df95fd710c7c-kube-api-access-bdqtg\") pod \"authorino-8b475cf9f-gwbpd\" (UID: \"3bc9f22e-a152-4bda-ba87-df95fd710c7c\") " pod="kuadrant-system/authorino-8b475cf9f-gwbpd" May 06 17:19:03.825857 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:03.825829 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-gwbpd"] May 06 17:19:03.826113 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:03.826099 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-gwbpd" May 06 17:19:03.854990 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:03.854962 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-67864dc994-flqfq"] May 06 17:19:03.886163 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:03.886128 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-67864dc994-flqfq"] May 06 17:19:03.886330 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:03.886251 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-67864dc994-flqfq" May 06 17:19:03.922454 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:03.922425 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-67864dc994-flqfq"] May 06 17:19:03.922748 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:19:03.922725 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cq6nt], unattached volumes=[], failed to process volumes=[kube-api-access-cq6nt]: context canceled" pod="kuadrant-system/authorino-67864dc994-flqfq" podUID="69d26602-5297-4c4e-9d41-a08b6676c3a8" May 06 17:19:03.952451 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:03.952419 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-67b5fd54c8-ppr72"] May 06 17:19:03.956833 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:19:03.956805 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bc9f22e_a152_4bda_ba87_df95fd710c7c.slice/crio-bac0ed1e0be25cb7e392643d36ccfadccc7d88313195e4320dbd76d80638ad4b WatchSource:0}: Error finding container bac0ed1e0be25cb7e392643d36ccfadccc7d88313195e4320dbd76d80638ad4b: Status 404 returned error can't find the container with id bac0ed1e0be25cb7e392643d36ccfadccc7d88313195e4320dbd76d80638ad4b May 06 17:19:03.974988 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:03.974961 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-gwbpd"] May 06 17:19:03.975109 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:03.975078 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-67b5fd54c8-ppr72" May 06 17:19:03.976309 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:03.976286 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-67b5fd54c8-ppr72"] May 06 17:19:03.980292 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:03.980270 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" May 06 17:19:04.005197 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:04.005162 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq6nt\" (UniqueName: \"kubernetes.io/projected/69d26602-5297-4c4e-9d41-a08b6676c3a8-kube-api-access-cq6nt\") pod \"authorino-67864dc994-flqfq\" (UID: \"69d26602-5297-4c4e-9d41-a08b6676c3a8\") " pod="kuadrant-system/authorino-67864dc994-flqfq" May 06 17:19:04.105907 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:04.105817 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vnj2\" (UniqueName: \"kubernetes.io/projected/2b945d45-0a1f-46d3-a1a2-9c207d9d0710-kube-api-access-5vnj2\") pod \"authorino-67b5fd54c8-ppr72\" (UID: \"2b945d45-0a1f-46d3-a1a2-9c207d9d0710\") " pod="kuadrant-system/authorino-67b5fd54c8-ppr72" May 06 17:19:04.105907 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:04.105886 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cq6nt\" (UniqueName: \"kubernetes.io/projected/69d26602-5297-4c4e-9d41-a08b6676c3a8-kube-api-access-cq6nt\") pod \"authorino-67864dc994-flqfq\" (UID: \"69d26602-5297-4c4e-9d41-a08b6676c3a8\") " pod="kuadrant-system/authorino-67864dc994-flqfq" May 06 17:19:04.106134 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:04.105928 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: 
\"kubernetes.io/secret/2b945d45-0a1f-46d3-a1a2-9c207d9d0710-tls-cert\") pod \"authorino-67b5fd54c8-ppr72\" (UID: \"2b945d45-0a1f-46d3-a1a2-9c207d9d0710\") " pod="kuadrant-system/authorino-67b5fd54c8-ppr72" May 06 17:19:04.131862 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:04.131829 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq6nt\" (UniqueName: \"kubernetes.io/projected/69d26602-5297-4c4e-9d41-a08b6676c3a8-kube-api-access-cq6nt\") pod \"authorino-67864dc994-flqfq\" (UID: \"69d26602-5297-4c4e-9d41-a08b6676c3a8\") " pod="kuadrant-system/authorino-67864dc994-flqfq" May 06 17:19:04.206852 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:04.206815 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vnj2\" (UniqueName: \"kubernetes.io/projected/2b945d45-0a1f-46d3-a1a2-9c207d9d0710-kube-api-access-5vnj2\") pod \"authorino-67b5fd54c8-ppr72\" (UID: \"2b945d45-0a1f-46d3-a1a2-9c207d9d0710\") " pod="kuadrant-system/authorino-67b5fd54c8-ppr72" May 06 17:19:04.207013 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:04.206871 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/2b945d45-0a1f-46d3-a1a2-9c207d9d0710-tls-cert\") pod \"authorino-67b5fd54c8-ppr72\" (UID: \"2b945d45-0a1f-46d3-a1a2-9c207d9d0710\") " pod="kuadrant-system/authorino-67b5fd54c8-ppr72" May 06 17:19:04.209124 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:04.209105 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/2b945d45-0a1f-46d3-a1a2-9c207d9d0710-tls-cert\") pod \"authorino-67b5fd54c8-ppr72\" (UID: \"2b945d45-0a1f-46d3-a1a2-9c207d9d0710\") " pod="kuadrant-system/authorino-67b5fd54c8-ppr72" May 06 17:19:04.220137 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:04.220113 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5vnj2\" (UniqueName: \"kubernetes.io/projected/2b945d45-0a1f-46d3-a1a2-9c207d9d0710-kube-api-access-5vnj2\") pod \"authorino-67b5fd54c8-ppr72\" (UID: \"2b945d45-0a1f-46d3-a1a2-9c207d9d0710\") " pod="kuadrant-system/authorino-67b5fd54c8-ppr72" May 06 17:19:04.284277 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:04.284251 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-67b5fd54c8-ppr72" May 06 17:19:04.408871 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:04.408845 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-67b5fd54c8-ppr72"] May 06 17:19:04.410770 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:19:04.410742 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b945d45_0a1f_46d3_a1a2_9c207d9d0710.slice/crio-2e729e5a5eb989cef749e98e8001d249a7e7efc9500d3418e3e70480fb0663da WatchSource:0}: Error finding container 2e729e5a5eb989cef749e98e8001d249a7e7efc9500d3418e3e70480fb0663da: Status 404 returned error can't find the container with id 2e729e5a5eb989cef749e98e8001d249a7e7efc9500d3418e3e70480fb0663da May 06 17:19:04.588981 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:04.588949 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-67b5fd54c8-ppr72" event={"ID":"2b945d45-0a1f-46d3-a1a2-9c207d9d0710","Type":"ContainerStarted","Data":"2e729e5a5eb989cef749e98e8001d249a7e7efc9500d3418e3e70480fb0663da"} May 06 17:19:04.590578 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:04.590547 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-gwbpd" event={"ID":"3bc9f22e-a152-4bda-ba87-df95fd710c7c","Type":"ContainerStarted","Data":"72fff49d8bdaabb84722b8a2e8a618f43d20424a4c6217031086f96365d50358"} May 06 17:19:04.590746 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:04.590579 2578 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-gwbpd" event={"ID":"3bc9f22e-a152-4bda-ba87-df95fd710c7c","Type":"ContainerStarted","Data":"bac0ed1e0be25cb7e392643d36ccfadccc7d88313195e4320dbd76d80638ad4b"} May 06 17:19:04.590746 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:04.590611 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-67864dc994-flqfq" May 06 17:19:04.590746 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:04.590612 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-gwbpd" podUID="3bc9f22e-a152-4bda-ba87-df95fd710c7c" containerName="authorino" containerID="cri-o://72fff49d8bdaabb84722b8a2e8a618f43d20424a4c6217031086f96365d50358" gracePeriod=30 May 06 17:19:04.595615 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:04.595497 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-67864dc994-flqfq" May 06 17:19:04.608343 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:04.608303 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-gwbpd" podStartSLOduration=1.084808148 podStartE2EDuration="1.60829112s" podCreationTimestamp="2026-05-06 17:19:03 +0000 UTC" firstStartedPulling="2026-05-06 17:19:03.958476115 +0000 UTC m=+517.599873702" lastFinishedPulling="2026-05-06 17:19:04.481959083 +0000 UTC m=+518.123356674" observedRunningTime="2026-05-06 17:19:04.606983447 +0000 UTC m=+518.248381046" watchObservedRunningTime="2026-05-06 17:19:04.60829112 +0000 UTC m=+518.249688718" May 06 17:19:04.711071 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:04.711046 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq6nt\" (UniqueName: \"kubernetes.io/projected/69d26602-5297-4c4e-9d41-a08b6676c3a8-kube-api-access-cq6nt\") pod 
\"69d26602-5297-4c4e-9d41-a08b6676c3a8\" (UID: \"69d26602-5297-4c4e-9d41-a08b6676c3a8\") " May 06 17:19:04.712927 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:04.712902 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69d26602-5297-4c4e-9d41-a08b6676c3a8-kube-api-access-cq6nt" (OuterVolumeSpecName: "kube-api-access-cq6nt") pod "69d26602-5297-4c4e-9d41-a08b6676c3a8" (UID: "69d26602-5297-4c4e-9d41-a08b6676c3a8"). InnerVolumeSpecName "kube-api-access-cq6nt". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 06 17:19:04.812611 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:04.812571 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cq6nt\" (UniqueName: \"kubernetes.io/projected/69d26602-5297-4c4e-9d41-a08b6676c3a8-kube-api-access-cq6nt\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\"" May 06 17:19:04.891363 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:04.891333 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-gwbpd" May 06 17:19:05.014844 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:05.014739 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdqtg\" (UniqueName: \"kubernetes.io/projected/3bc9f22e-a152-4bda-ba87-df95fd710c7c-kube-api-access-bdqtg\") pod \"3bc9f22e-a152-4bda-ba87-df95fd710c7c\" (UID: \"3bc9f22e-a152-4bda-ba87-df95fd710c7c\") " May 06 17:19:05.017363 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:05.017328 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bc9f22e-a152-4bda-ba87-df95fd710c7c-kube-api-access-bdqtg" (OuterVolumeSpecName: "kube-api-access-bdqtg") pod "3bc9f22e-a152-4bda-ba87-df95fd710c7c" (UID: "3bc9f22e-a152-4bda-ba87-df95fd710c7c"). InnerVolumeSpecName "kube-api-access-bdqtg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" May 06 17:19:05.115461 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:05.115426 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bdqtg\" (UniqueName: \"kubernetes.io/projected/3bc9f22e-a152-4bda-ba87-df95fd710c7c-kube-api-access-bdqtg\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\"" May 06 17:19:05.595076 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:05.595037 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-67b5fd54c8-ppr72" event={"ID":"2b945d45-0a1f-46d3-a1a2-9c207d9d0710","Type":"ContainerStarted","Data":"bf08087bc4934ef81de11c9f2d9e0f0dc72ea1fac3ffa80c2d926555a0cf18a8"} May 06 17:19:05.596125 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:05.596097 2578 generic.go:358] "Generic (PLEG): container finished" podID="3bc9f22e-a152-4bda-ba87-df95fd710c7c" containerID="72fff49d8bdaabb84722b8a2e8a618f43d20424a4c6217031086f96365d50358" exitCode=0 May 06 17:19:05.596219 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:05.596139 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-gwbpd" May 06 17:19:05.596219 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:05.596177 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-gwbpd" event={"ID":"3bc9f22e-a152-4bda-ba87-df95fd710c7c","Type":"ContainerDied","Data":"72fff49d8bdaabb84722b8a2e8a618f43d20424a4c6217031086f96365d50358"} May 06 17:19:05.596334 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:05.596218 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-gwbpd" event={"ID":"3bc9f22e-a152-4bda-ba87-df95fd710c7c","Type":"ContainerDied","Data":"bac0ed1e0be25cb7e392643d36ccfadccc7d88313195e4320dbd76d80638ad4b"} May 06 17:19:05.596334 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:05.596241 2578 scope.go:117] "RemoveContainer" containerID="72fff49d8bdaabb84722b8a2e8a618f43d20424a4c6217031086f96365d50358" May 06 17:19:05.596334 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:05.596249 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-67864dc994-flqfq" May 06 17:19:05.604722 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:05.604705 2578 scope.go:117] "RemoveContainer" containerID="72fff49d8bdaabb84722b8a2e8a618f43d20424a4c6217031086f96365d50358" May 06 17:19:05.604948 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:19:05.604931 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72fff49d8bdaabb84722b8a2e8a618f43d20424a4c6217031086f96365d50358\": container with ID starting with 72fff49d8bdaabb84722b8a2e8a618f43d20424a4c6217031086f96365d50358 not found: ID does not exist" containerID="72fff49d8bdaabb84722b8a2e8a618f43d20424a4c6217031086f96365d50358" May 06 17:19:05.604985 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:05.604955 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72fff49d8bdaabb84722b8a2e8a618f43d20424a4c6217031086f96365d50358"} err="failed to get container status \"72fff49d8bdaabb84722b8a2e8a618f43d20424a4c6217031086f96365d50358\": rpc error: code = NotFound desc = could not find container \"72fff49d8bdaabb84722b8a2e8a618f43d20424a4c6217031086f96365d50358\": container with ID starting with 72fff49d8bdaabb84722b8a2e8a618f43d20424a4c6217031086f96365d50358 not found: ID does not exist" May 06 17:19:05.613091 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:05.613056 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-67b5fd54c8-ppr72" podStartSLOduration=2.231062629 podStartE2EDuration="2.613045619s" podCreationTimestamp="2026-05-06 17:19:03 +0000 UTC" firstStartedPulling="2026-05-06 17:19:04.412085986 +0000 UTC m=+518.053483563" lastFinishedPulling="2026-05-06 17:19:04.794068977 +0000 UTC m=+518.435466553" observedRunningTime="2026-05-06 17:19:05.611362398 +0000 UTC m=+519.252759998" watchObservedRunningTime="2026-05-06 17:19:05.613045619 +0000 UTC 
m=+519.254443221" May 06 17:19:05.636407 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:05.636380 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-67864dc994-flqfq"] May 06 17:19:05.642292 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:05.642266 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-67864dc994-flqfq"] May 06 17:19:05.657190 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:05.657166 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-gwbpd"] May 06 17:19:05.659458 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:05.659438 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-gwbpd"] May 06 17:19:06.885442 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:06.885408 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bc9f22e-a152-4bda-ba87-df95fd710c7c" path="/var/lib/kubelet/pods/3bc9f22e-a152-4bda-ba87-df95fd710c7c/volumes" May 06 17:19:06.885887 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:19:06.885757 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69d26602-5297-4c4e-9d41-a08b6676c3a8" path="/var/lib/kubelet/pods/69d26602-5297-4c4e-9d41-a08b6676c3a8/volumes" May 06 17:30:00.151329 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:00.151237 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29634810-gvgfc"] May 06 17:30:00.151929 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:00.151769 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3bc9f22e-a152-4bda-ba87-df95fd710c7c" containerName="authorino" May 06 17:30:00.151929 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:00.151790 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc9f22e-a152-4bda-ba87-df95fd710c7c" containerName="authorino" May 06 17:30:00.151929 ip-10-0-131-115 kubenswrapper[2578]: I0506 
17:30:00.151884 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="3bc9f22e-a152-4bda-ba87-df95fd710c7c" containerName="authorino" May 06 17:30:00.154834 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:00.154813 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29634810-gvgfc" May 06 17:30:00.157620 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:00.157603 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-7h8kw\"" May 06 17:30:00.172767 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:00.172739 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29634810-gvgfc"] May 06 17:30:00.279878 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:00.279845 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t224\" (UniqueName: \"kubernetes.io/projected/f6641fb2-75a9-415e-ad5d-ca868481fc80-kube-api-access-6t224\") pod \"maas-api-key-cleanup-29634810-gvgfc\" (UID: \"f6641fb2-75a9-415e-ad5d-ca868481fc80\") " pod="opendatahub/maas-api-key-cleanup-29634810-gvgfc" May 06 17:30:00.381284 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:00.381242 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6t224\" (UniqueName: \"kubernetes.io/projected/f6641fb2-75a9-415e-ad5d-ca868481fc80-kube-api-access-6t224\") pod \"maas-api-key-cleanup-29634810-gvgfc\" (UID: \"f6641fb2-75a9-415e-ad5d-ca868481fc80\") " pod="opendatahub/maas-api-key-cleanup-29634810-gvgfc" May 06 17:30:00.390340 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:00.390310 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t224\" (UniqueName: \"kubernetes.io/projected/f6641fb2-75a9-415e-ad5d-ca868481fc80-kube-api-access-6t224\") pod \"maas-api-key-cleanup-29634810-gvgfc\" (UID: 
\"f6641fb2-75a9-415e-ad5d-ca868481fc80\") " pod="opendatahub/maas-api-key-cleanup-29634810-gvgfc" May 06 17:30:00.464464 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:00.464436 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29634810-gvgfc" May 06 17:30:00.588638 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:00.588609 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29634810-gvgfc"] May 06 17:30:00.590378 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:30:00.590352 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6641fb2_75a9_415e_ad5d_ca868481fc80.slice/crio-2494ad6a1c8e365af8e0762ff57f91fbf5448510d351f80b1ed7324eba55458a WatchSource:0}: Error finding container 2494ad6a1c8e365af8e0762ff57f91fbf5448510d351f80b1ed7324eba55458a: Status 404 returned error can't find the container with id 2494ad6a1c8e365af8e0762ff57f91fbf5448510d351f80b1ed7324eba55458a May 06 17:30:00.591990 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:00.591972 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider May 06 17:30:00.701569 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:00.701527 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29634810-gvgfc" event={"ID":"f6641fb2-75a9-415e-ad5d-ca868481fc80","Type":"ContainerStarted","Data":"2494ad6a1c8e365af8e0762ff57f91fbf5448510d351f80b1ed7324eba55458a"} May 06 17:30:02.711513 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:02.711482 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29634810-gvgfc" event={"ID":"f6641fb2-75a9-415e-ad5d-ca868481fc80","Type":"ContainerStarted","Data":"780f34730c9f666f336d3a0f2da87a38035029623b505624cec91535c68b6fbd"} May 06 17:30:02.727483 ip-10-0-131-115 kubenswrapper[2578]: I0506 
17:30:02.727437 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29634810-gvgfc" podStartSLOduration=0.961575763 podStartE2EDuration="2.72742238s" podCreationTimestamp="2026-05-06 17:30:00 +0000 UTC" firstStartedPulling="2026-05-06 17:30:00.592121819 +0000 UTC m=+1174.233519395" lastFinishedPulling="2026-05-06 17:30:02.357968416 +0000 UTC m=+1175.999366012" observedRunningTime="2026-05-06 17:30:02.727040952 +0000 UTC m=+1176.368438551" watchObservedRunningTime="2026-05-06 17:30:02.72742238 +0000 UTC m=+1176.368819978" May 06 17:30:03.716176 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:03.716144 2578 generic.go:358] "Generic (PLEG): container finished" podID="f6641fb2-75a9-415e-ad5d-ca868481fc80" containerID="780f34730c9f666f336d3a0f2da87a38035029623b505624cec91535c68b6fbd" exitCode=7 May 06 17:30:03.716565 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:03.716235 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29634810-gvgfc" event={"ID":"f6641fb2-75a9-415e-ad5d-ca868481fc80","Type":"ContainerDied","Data":"780f34730c9f666f336d3a0f2da87a38035029623b505624cec91535c68b6fbd"} May 06 17:30:03.716565 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:03.716385 2578 scope.go:117] "RemoveContainer" containerID="780f34730c9f666f336d3a0f2da87a38035029623b505624cec91535c68b6fbd" May 06 17:30:04.721202 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:04.721109 2578 generic.go:358] "Generic (PLEG): container finished" podID="f6641fb2-75a9-415e-ad5d-ca868481fc80" containerID="da8aedf754986ea89893bd8e64f5b3671d9387eca58227d88884b22d4d0ed954" exitCode=7 May 06 17:30:04.721202 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:04.721178 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29634810-gvgfc" 
event={"ID":"f6641fb2-75a9-415e-ad5d-ca868481fc80","Type":"ContainerDied","Data":"da8aedf754986ea89893bd8e64f5b3671d9387eca58227d88884b22d4d0ed954"} May 06 17:30:04.721746 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:04.721209 2578 scope.go:117] "RemoveContainer" containerID="780f34730c9f666f336d3a0f2da87a38035029623b505624cec91535c68b6fbd" May 06 17:30:04.721746 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:04.721454 2578 scope.go:117] "RemoveContainer" containerID="da8aedf754986ea89893bd8e64f5b3671d9387eca58227d88884b22d4d0ed954" May 06 17:30:04.721746 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:30:04.721721 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29634810-gvgfc_opendatahub(f6641fb2-75a9-415e-ad5d-ca868481fc80)\"" pod="opendatahub/maas-api-key-cleanup-29634810-gvgfc" podUID="f6641fb2-75a9-415e-ad5d-ca868481fc80" May 06 17:30:05.726529 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:05.726502 2578 scope.go:117] "RemoveContainer" containerID="da8aedf754986ea89893bd8e64f5b3671d9387eca58227d88884b22d4d0ed954" May 06 17:30:05.727036 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:30:05.726708 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29634810-gvgfc_opendatahub(f6641fb2-75a9-415e-ad5d-ca868481fc80)\"" pod="opendatahub/maas-api-key-cleanup-29634810-gvgfc" podUID="f6641fb2-75a9-415e-ad5d-ca868481fc80" May 06 17:30:19.880067 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:19.880036 2578 scope.go:117] "RemoveContainer" containerID="da8aedf754986ea89893bd8e64f5b3671d9387eca58227d88884b22d4d0ed954" May 06 17:30:20.778441 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:20.778348 2578 generic.go:358] "Generic (PLEG): container finished" 
podID="f6641fb2-75a9-415e-ad5d-ca868481fc80" containerID="228a7c01d751276c892b85f7cd0dde61909d2e71c82d1b37291566ffb29ffe19" exitCode=7 May 06 17:30:20.778632 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:20.778431 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29634810-gvgfc" event={"ID":"f6641fb2-75a9-415e-ad5d-ca868481fc80","Type":"ContainerDied","Data":"228a7c01d751276c892b85f7cd0dde61909d2e71c82d1b37291566ffb29ffe19"} May 06 17:30:20.778632 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:20.778489 2578 scope.go:117] "RemoveContainer" containerID="da8aedf754986ea89893bd8e64f5b3671d9387eca58227d88884b22d4d0ed954" May 06 17:30:20.778820 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:20.778804 2578 scope.go:117] "RemoveContainer" containerID="228a7c01d751276c892b85f7cd0dde61909d2e71c82d1b37291566ffb29ffe19" May 06 17:30:20.779050 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:30:20.779017 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cleanup pod=maas-api-key-cleanup-29634810-gvgfc_opendatahub(f6641fb2-75a9-415e-ad5d-ca868481fc80)\"" pod="opendatahub/maas-api-key-cleanup-29634810-gvgfc" podUID="f6641fb2-75a9-415e-ad5d-ca868481fc80" May 06 17:30:21.803760 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:21.803728 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29634810-gvgfc"] May 06 17:30:21.930397 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:21.930376 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29634810-gvgfc" May 06 17:30:21.950538 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:21.950511 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t224\" (UniqueName: \"kubernetes.io/projected/f6641fb2-75a9-415e-ad5d-ca868481fc80-kube-api-access-6t224\") pod \"f6641fb2-75a9-415e-ad5d-ca868481fc80\" (UID: \"f6641fb2-75a9-415e-ad5d-ca868481fc80\") " May 06 17:30:21.953384 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:21.953358 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6641fb2-75a9-415e-ad5d-ca868481fc80-kube-api-access-6t224" (OuterVolumeSpecName: "kube-api-access-6t224") pod "f6641fb2-75a9-415e-ad5d-ca868481fc80" (UID: "f6641fb2-75a9-415e-ad5d-ca868481fc80"). InnerVolumeSpecName "kube-api-access-6t224". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 06 17:30:22.051742 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:22.051706 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6t224\" (UniqueName: \"kubernetes.io/projected/f6641fb2-75a9-415e-ad5d-ca868481fc80-kube-api-access-6t224\") on node \"ip-10-0-131-115.ec2.internal\" DevicePath \"\"" May 06 17:30:22.787119 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:22.787083 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29634810-gvgfc" event={"ID":"f6641fb2-75a9-415e-ad5d-ca868481fc80","Type":"ContainerDied","Data":"2494ad6a1c8e365af8e0762ff57f91fbf5448510d351f80b1ed7324eba55458a"} May 06 17:30:22.787301 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:22.787131 2578 scope.go:117] "RemoveContainer" containerID="228a7c01d751276c892b85f7cd0dde61909d2e71c82d1b37291566ffb29ffe19" May 06 17:30:22.787301 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:22.787101 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29634810-gvgfc" May 06 17:30:22.809457 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:22.809428 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29634810-gvgfc"] May 06 17:30:22.811055 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:22.811034 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29634810-gvgfc"] May 06 17:30:22.884721 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:30:22.884688 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6641fb2-75a9-415e-ad5d-ca868481fc80" path="/var/lib/kubelet/pods/f6641fb2-75a9-415e-ad5d-ca868481fc80/volumes" May 06 17:40:10.586509 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.586477 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp"] May 06 17:40:10.589085 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.586993 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6641fb2-75a9-415e-ad5d-ca868481fc80" containerName="cleanup" May 06 17:40:10.589085 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.587011 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6641fb2-75a9-415e-ad5d-ca868481fc80" containerName="cleanup" May 06 17:40:10.589085 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.587033 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6641fb2-75a9-415e-ad5d-ca868481fc80" containerName="cleanup" May 06 17:40:10.589085 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.587039 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6641fb2-75a9-415e-ad5d-ca868481fc80" containerName="cleanup" May 06 17:40:10.589085 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.587090 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6641fb2-75a9-415e-ad5d-ca868481fc80" 
containerName="cleanup" May 06 17:40:10.589085 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.587097 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6641fb2-75a9-415e-ad5d-ca868481fc80" containerName="cleanup" May 06 17:40:10.589085 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.587104 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6641fb2-75a9-415e-ad5d-ca868481fc80" containerName="cleanup" May 06 17:40:10.589085 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.587152 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6641fb2-75a9-415e-ad5d-ca868481fc80" containerName="cleanup" May 06 17:40:10.589085 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.587157 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6641fb2-75a9-415e-ad5d-ca868481fc80" containerName="cleanup" May 06 17:40:10.590015 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.589994 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" May 06 17:40:10.596686 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.596388 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" May 06 17:40:10.596686 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.596398 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" May 06 17:40:10.596686 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.596463 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" May 06 17:40:10.596843 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.596735 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-mb796\"" May 06 17:40:10.602259 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.602238 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp"] May 06 17:40:10.732524 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.732489 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml4nm\" (UniqueName: \"kubernetes.io/projected/df812e4b-1ed9-487f-967e-ab638396a227-kube-api-access-ml4nm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wffqp\" (UID: \"df812e4b-1ed9-487f-967e-ab638396a227\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" May 06 17:40:10.732713 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.732546 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/df812e4b-1ed9-487f-967e-ab638396a227-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wffqp\" (UID: \"df812e4b-1ed9-487f-967e-ab638396a227\") " 
pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" May 06 17:40:10.732713 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.732616 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/df812e4b-1ed9-487f-967e-ab638396a227-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wffqp\" (UID: \"df812e4b-1ed9-487f-967e-ab638396a227\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" May 06 17:40:10.732713 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.732638 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df812e4b-1ed9-487f-967e-ab638396a227-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wffqp\" (UID: \"df812e4b-1ed9-487f-967e-ab638396a227\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" May 06 17:40:10.732713 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.732673 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/df812e4b-1ed9-487f-967e-ab638396a227-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wffqp\" (UID: \"df812e4b-1ed9-487f-967e-ab638396a227\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" May 06 17:40:10.732851 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.732752 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/df812e4b-1ed9-487f-967e-ab638396a227-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wffqp\" (UID: \"df812e4b-1ed9-487f-967e-ab638396a227\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" May 06 17:40:10.833701 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.833667 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df812e4b-1ed9-487f-967e-ab638396a227-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wffqp\" (UID: \"df812e4b-1ed9-487f-967e-ab638396a227\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" May 06 17:40:10.833875 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.833716 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/df812e4b-1ed9-487f-967e-ab638396a227-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wffqp\" (UID: \"df812e4b-1ed9-487f-967e-ab638396a227\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" May 06 17:40:10.833875 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.833742 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/df812e4b-1ed9-487f-967e-ab638396a227-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wffqp\" (UID: \"df812e4b-1ed9-487f-967e-ab638396a227\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" May 06 17:40:10.833875 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.833777 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ml4nm\" (UniqueName: \"kubernetes.io/projected/df812e4b-1ed9-487f-967e-ab638396a227-kube-api-access-ml4nm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wffqp\" (UID: \"df812e4b-1ed9-487f-967e-ab638396a227\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" May 06 17:40:10.833875 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.833815 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/df812e4b-1ed9-487f-967e-ab638396a227-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wffqp\" (UID: 
\"df812e4b-1ed9-487f-967e-ab638396a227\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" May 06 17:40:10.833875 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.833844 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/df812e4b-1ed9-487f-967e-ab638396a227-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wffqp\" (UID: \"df812e4b-1ed9-487f-967e-ab638396a227\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" May 06 17:40:10.834149 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.834125 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df812e4b-1ed9-487f-967e-ab638396a227-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wffqp\" (UID: \"df812e4b-1ed9-487f-967e-ab638396a227\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" May 06 17:40:10.834197 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.834177 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/df812e4b-1ed9-487f-967e-ab638396a227-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wffqp\" (UID: \"df812e4b-1ed9-487f-967e-ab638396a227\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" May 06 17:40:10.834244 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.834223 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/df812e4b-1ed9-487f-967e-ab638396a227-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wffqp\" (UID: \"df812e4b-1ed9-487f-967e-ab638396a227\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" May 06 17:40:10.836119 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.836095 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" 
(UniqueName: \"kubernetes.io/empty-dir/df812e4b-1ed9-487f-967e-ab638396a227-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wffqp\" (UID: \"df812e4b-1ed9-487f-967e-ab638396a227\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" May 06 17:40:10.836269 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.836254 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/df812e4b-1ed9-487f-967e-ab638396a227-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wffqp\" (UID: \"df812e4b-1ed9-487f-967e-ab638396a227\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" May 06 17:40:10.847789 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.847726 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml4nm\" (UniqueName: \"kubernetes.io/projected/df812e4b-1ed9-487f-967e-ab638396a227-kube-api-access-ml4nm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wffqp\" (UID: \"df812e4b-1ed9-487f-967e-ab638396a227\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" May 06 17:40:10.900691 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:10.900660 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" May 06 17:40:11.040290 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:11.040126 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp"] May 06 17:40:11.046799 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:40:11.046771 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf812e4b_1ed9_487f_967e_ab638396a227.slice/crio-1058166fbdbb222544bef2f883814e5e8405eb48273a0259400dfdf90814d312 WatchSource:0}: Error finding container 1058166fbdbb222544bef2f883814e5e8405eb48273a0259400dfdf90814d312: Status 404 returned error can't find the container with id 1058166fbdbb222544bef2f883814e5e8405eb48273a0259400dfdf90814d312 May 06 17:40:11.048554 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:11.048536 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider May 06 17:40:11.725992 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:11.725957 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" event={"ID":"df812e4b-1ed9-487f-967e-ab638396a227","Type":"ContainerStarted","Data":"1058166fbdbb222544bef2f883814e5e8405eb48273a0259400dfdf90814d312"} May 06 17:40:16.746441 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:16.746407 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" event={"ID":"df812e4b-1ed9-487f-967e-ab638396a227","Type":"ContainerStarted","Data":"a348bb649fedbddeb0763c12460aebe944cb98c2d87a4bbb0b5070adbad442b6"} May 06 17:40:21.907477 ip-10-0-131-115 kubenswrapper[2578]: E0506 17:40:21.907449 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf812e4b_1ed9_487f_967e_ab638396a227.slice/crio-a348bb649fedbddeb0763c12460aebe944cb98c2d87a4bbb0b5070adbad442b6.scope\": RecentStats: unable to find data in memory cache]" May 06 17:40:22.767308 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:22.767269 2578 generic.go:358] "Generic (PLEG): container finished" podID="df812e4b-1ed9-487f-967e-ab638396a227" containerID="a348bb649fedbddeb0763c12460aebe944cb98c2d87a4bbb0b5070adbad442b6" exitCode=0 May 06 17:40:22.767474 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:22.767335 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" event={"ID":"df812e4b-1ed9-487f-967e-ab638396a227","Type":"ContainerDied","Data":"a348bb649fedbddeb0763c12460aebe944cb98c2d87a4bbb0b5070adbad442b6"} May 06 17:40:24.775674 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:24.775641 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" event={"ID":"df812e4b-1ed9-487f-967e-ab638396a227","Type":"ContainerStarted","Data":"3747fb23886b3dcc4be0270e35057232762cb751a1b307fb1c0989024de2da61"} May 06 17:40:24.776089 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:24.775879 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" May 06 17:40:24.799912 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:24.799853 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" podStartSLOduration=1.7436545780000001 podStartE2EDuration="14.799836201s" podCreationTimestamp="2026-05-06 17:40:10 +0000 UTC" firstStartedPulling="2026-05-06 17:40:11.04871252 +0000 UTC m=+1784.690110097" lastFinishedPulling="2026-05-06 17:40:24.104894145 +0000 UTC m=+1797.746291720" observedRunningTime="2026-05-06 17:40:24.797497003 +0000 UTC 
m=+1798.438894625" watchObservedRunningTime="2026-05-06 17:40:24.799836201 +0000 UTC m=+1798.441233805" May 06 17:40:35.792167 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:35.792135 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wffqp" May 06 17:40:37.509224 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.509188 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb"] May 06 17:40:37.513536 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.513519 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" May 06 17:40:37.516276 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.516253 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" May 06 17:40:37.523019 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.522995 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb"] May 06 17:40:37.569166 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.569141 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/203dab7c-0e97-47ae-8926-2a9b70e1bd41-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb\" (UID: \"203dab7c-0e97-47ae-8926-2a9b70e1bd41\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" May 06 17:40:37.569332 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.569170 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blf6t\" (UniqueName: \"kubernetes.io/projected/203dab7c-0e97-47ae-8926-2a9b70e1bd41-kube-api-access-blf6t\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb\" (UID: \"203dab7c-0e97-47ae-8926-2a9b70e1bd41\") " 
pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" May 06 17:40:37.569332 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.569189 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/203dab7c-0e97-47ae-8926-2a9b70e1bd41-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb\" (UID: \"203dab7c-0e97-47ae-8926-2a9b70e1bd41\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" May 06 17:40:37.569332 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.569295 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/203dab7c-0e97-47ae-8926-2a9b70e1bd41-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb\" (UID: \"203dab7c-0e97-47ae-8926-2a9b70e1bd41\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" May 06 17:40:37.569495 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.569331 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/203dab7c-0e97-47ae-8926-2a9b70e1bd41-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb\" (UID: \"203dab7c-0e97-47ae-8926-2a9b70e1bd41\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" May 06 17:40:37.569495 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.569386 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/203dab7c-0e97-47ae-8926-2a9b70e1bd41-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb\" (UID: \"203dab7c-0e97-47ae-8926-2a9b70e1bd41\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" May 06 17:40:37.670839 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.670801 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/203dab7c-0e97-47ae-8926-2a9b70e1bd41-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb\" (UID: \"203dab7c-0e97-47ae-8926-2a9b70e1bd41\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" May 06 17:40:37.671000 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.670854 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/203dab7c-0e97-47ae-8926-2a9b70e1bd41-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb\" (UID: \"203dab7c-0e97-47ae-8926-2a9b70e1bd41\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" May 06 17:40:37.671000 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.670911 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/203dab7c-0e97-47ae-8926-2a9b70e1bd41-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb\" (UID: \"203dab7c-0e97-47ae-8926-2a9b70e1bd41\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" May 06 17:40:37.671000 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.670935 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-blf6t\" (UniqueName: \"kubernetes.io/projected/203dab7c-0e97-47ae-8926-2a9b70e1bd41-kube-api-access-blf6t\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb\" (UID: \"203dab7c-0e97-47ae-8926-2a9b70e1bd41\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" May 06 17:40:37.671000 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.670960 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/203dab7c-0e97-47ae-8926-2a9b70e1bd41-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb\" (UID: \"203dab7c-0e97-47ae-8926-2a9b70e1bd41\") " 
pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" May 06 17:40:37.671187 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.671035 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/203dab7c-0e97-47ae-8926-2a9b70e1bd41-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb\" (UID: \"203dab7c-0e97-47ae-8926-2a9b70e1bd41\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" May 06 17:40:37.671252 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.671231 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/203dab7c-0e97-47ae-8926-2a9b70e1bd41-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb\" (UID: \"203dab7c-0e97-47ae-8926-2a9b70e1bd41\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" May 06 17:40:37.671310 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.671291 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/203dab7c-0e97-47ae-8926-2a9b70e1bd41-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb\" (UID: \"203dab7c-0e97-47ae-8926-2a9b70e1bd41\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" May 06 17:40:37.671403 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.671385 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/203dab7c-0e97-47ae-8926-2a9b70e1bd41-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb\" (UID: \"203dab7c-0e97-47ae-8926-2a9b70e1bd41\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" May 06 17:40:37.673081 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.673062 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/203dab7c-0e97-47ae-8926-2a9b70e1bd41-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb\" (UID: \"203dab7c-0e97-47ae-8926-2a9b70e1bd41\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" May 06 17:40:37.673347 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.673331 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/203dab7c-0e97-47ae-8926-2a9b70e1bd41-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb\" (UID: \"203dab7c-0e97-47ae-8926-2a9b70e1bd41\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" May 06 17:40:37.679063 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.679041 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-blf6t\" (UniqueName: \"kubernetes.io/projected/203dab7c-0e97-47ae-8926-2a9b70e1bd41-kube-api-access-blf6t\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb\" (UID: \"203dab7c-0e97-47ae-8926-2a9b70e1bd41\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" May 06 17:40:37.824913 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.824815 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" May 06 17:40:37.955121 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:37.955069 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb"] May 06 17:40:37.957339 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:40:37.957311 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod203dab7c_0e97_47ae_8926_2a9b70e1bd41.slice/crio-d0948a6f2bf440cde5989666223f84593c2022c0be6065d145d1ad1ed813e87e WatchSource:0}: Error finding container d0948a6f2bf440cde5989666223f84593c2022c0be6065d145d1ad1ed813e87e: Status 404 returned error can't find the container with id d0948a6f2bf440cde5989666223f84593c2022c0be6065d145d1ad1ed813e87e May 06 17:40:38.177035 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.177004 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6"] May 06 17:40:38.180808 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.180789 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" May 06 17:40:38.183974 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.183955 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" May 06 17:40:38.193302 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.193279 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6"] May 06 17:40:38.277266 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.277216 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d689d1bd-c693-42cc-875b-5387a78ede9d-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6\" (UID: \"d689d1bd-c693-42cc-875b-5387a78ede9d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" May 06 17:40:38.277446 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.277308 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d689d1bd-c693-42cc-875b-5387a78ede9d-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6\" (UID: \"d689d1bd-c693-42cc-875b-5387a78ede9d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" May 06 17:40:38.277446 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.277359 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4m5h\" (UniqueName: \"kubernetes.io/projected/d689d1bd-c693-42cc-875b-5387a78ede9d-kube-api-access-n4m5h\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6\" (UID: \"d689d1bd-c693-42cc-875b-5387a78ede9d\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" May 06 17:40:38.277446 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.277441 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d689d1bd-c693-42cc-875b-5387a78ede9d-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6\" (UID: \"d689d1bd-c693-42cc-875b-5387a78ede9d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" May 06 17:40:38.277636 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.277469 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d689d1bd-c693-42cc-875b-5387a78ede9d-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6\" (UID: \"d689d1bd-c693-42cc-875b-5387a78ede9d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" May 06 17:40:38.277636 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.277514 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d689d1bd-c693-42cc-875b-5387a78ede9d-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6\" (UID: \"d689d1bd-c693-42cc-875b-5387a78ede9d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" May 06 17:40:38.378484 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.378447 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d689d1bd-c693-42cc-875b-5387a78ede9d-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6\" (UID: \"d689d1bd-c693-42cc-875b-5387a78ede9d\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" May 06 17:40:38.378657 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.378527 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d689d1bd-c693-42cc-875b-5387a78ede9d-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6\" (UID: \"d689d1bd-c693-42cc-875b-5387a78ede9d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" May 06 17:40:38.378657 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.378564 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d689d1bd-c693-42cc-875b-5387a78ede9d-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6\" (UID: \"d689d1bd-c693-42cc-875b-5387a78ede9d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" May 06 17:40:38.378657 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.378625 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4m5h\" (UniqueName: \"kubernetes.io/projected/d689d1bd-c693-42cc-875b-5387a78ede9d-kube-api-access-n4m5h\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6\" (UID: \"d689d1bd-c693-42cc-875b-5387a78ede9d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" May 06 17:40:38.378826 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.378685 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d689d1bd-c693-42cc-875b-5387a78ede9d-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6\" (UID: \"d689d1bd-c693-42cc-875b-5387a78ede9d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" May 06 17:40:38.378826 
ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.378715 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d689d1bd-c693-42cc-875b-5387a78ede9d-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6\" (UID: \"d689d1bd-c693-42cc-875b-5387a78ede9d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" May 06 17:40:38.378964 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.378937 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d689d1bd-c693-42cc-875b-5387a78ede9d-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6\" (UID: \"d689d1bd-c693-42cc-875b-5387a78ede9d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" May 06 17:40:38.379040 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.379011 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d689d1bd-c693-42cc-875b-5387a78ede9d-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6\" (UID: \"d689d1bd-c693-42cc-875b-5387a78ede9d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" May 06 17:40:38.379131 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.379105 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d689d1bd-c693-42cc-875b-5387a78ede9d-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6\" (UID: \"d689d1bd-c693-42cc-875b-5387a78ede9d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" May 06 17:40:38.380803 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.380780 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/d689d1bd-c693-42cc-875b-5387a78ede9d-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6\" (UID: \"d689d1bd-c693-42cc-875b-5387a78ede9d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" May 06 17:40:38.380898 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.380887 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d689d1bd-c693-42cc-875b-5387a78ede9d-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6\" (UID: \"d689d1bd-c693-42cc-875b-5387a78ede9d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" May 06 17:40:38.391636 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.391618 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4m5h\" (UniqueName: \"kubernetes.io/projected/d689d1bd-c693-42cc-875b-5387a78ede9d-kube-api-access-n4m5h\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6\" (UID: \"d689d1bd-c693-42cc-875b-5387a78ede9d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" May 06 17:40:38.491881 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.491801 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" May 06 17:40:38.628320 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.628285 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6"] May 06 17:40:38.631327 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:40:38.631279 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd689d1bd_c693_42cc_875b_5387a78ede9d.slice/crio-7cdb8f78461be5ef4eb98dd933dee4b51d666ad94f7e9d7ea41b076623d05a22 WatchSource:0}: Error finding container 7cdb8f78461be5ef4eb98dd933dee4b51d666ad94f7e9d7ea41b076623d05a22: Status 404 returned error can't find the container with id 7cdb8f78461be5ef4eb98dd933dee4b51d666ad94f7e9d7ea41b076623d05a22 May 06 17:40:38.825553 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.825454 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" event={"ID":"d689d1bd-c693-42cc-875b-5387a78ede9d","Type":"ContainerStarted","Data":"c481938b7bec009b1aab19176e7047b7a2d2bc84ae2d48e2d6bb4be0b82075b6"} May 06 17:40:38.825553 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.825499 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" event={"ID":"d689d1bd-c693-42cc-875b-5387a78ede9d","Type":"ContainerStarted","Data":"7cdb8f78461be5ef4eb98dd933dee4b51d666ad94f7e9d7ea41b076623d05a22"} May 06 17:40:38.827045 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:38.827010 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" event={"ID":"203dab7c-0e97-47ae-8926-2a9b70e1bd41","Type":"ContainerStarted","Data":"0e37500e3f73ea0606b4021747ab43fd82214f088989bc46d03ad025407d3824"} May 06 17:40:38.827161 ip-10-0-131-115 
kubenswrapper[2578]: I0506 17:40:38.827052 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" event={"ID":"203dab7c-0e97-47ae-8926-2a9b70e1bd41","Type":"ContainerStarted","Data":"d0948a6f2bf440cde5989666223f84593c2022c0be6065d145d1ad1ed813e87e"} May 06 17:40:43.848939 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:43.848843 2578 generic.go:358] "Generic (PLEG): container finished" podID="203dab7c-0e97-47ae-8926-2a9b70e1bd41" containerID="0e37500e3f73ea0606b4021747ab43fd82214f088989bc46d03ad025407d3824" exitCode=0 May 06 17:40:43.848939 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:43.848917 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" event={"ID":"203dab7c-0e97-47ae-8926-2a9b70e1bd41","Type":"ContainerDied","Data":"0e37500e3f73ea0606b4021747ab43fd82214f088989bc46d03ad025407d3824"} May 06 17:40:44.853502 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:44.853464 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" event={"ID":"203dab7c-0e97-47ae-8926-2a9b70e1bd41","Type":"ContainerStarted","Data":"ebe25cc73b45841bc3c425a01048e33cf4fce4f32ddc8e72a85683cf5afa8c4b"} May 06 17:40:44.853950 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:44.853698 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" May 06 17:40:44.854769 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:44.854750 2578 generic.go:358] "Generic (PLEG): container finished" podID="d689d1bd-c693-42cc-875b-5387a78ede9d" containerID="c481938b7bec009b1aab19176e7047b7a2d2bc84ae2d48e2d6bb4be0b82075b6" exitCode=0 May 06 17:40:44.854820 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:44.854812 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" 
event={"ID":"d689d1bd-c693-42cc-875b-5387a78ede9d","Type":"ContainerDied","Data":"c481938b7bec009b1aab19176e7047b7a2d2bc84ae2d48e2d6bb4be0b82075b6"} May 06 17:40:44.883981 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:44.883923 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" podStartSLOduration=7.650584261 podStartE2EDuration="7.883905534s" podCreationTimestamp="2026-05-06 17:40:37 +0000 UTC" firstStartedPulling="2026-05-06 17:40:43.849550177 +0000 UTC m=+1817.490947753" lastFinishedPulling="2026-05-06 17:40:44.082871438 +0000 UTC m=+1817.724269026" observedRunningTime="2026-05-06 17:40:44.881410756 +0000 UTC m=+1818.522808359" watchObservedRunningTime="2026-05-06 17:40:44.883905534 +0000 UTC m=+1818.525303134" May 06 17:40:45.859994 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:45.859956 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" event={"ID":"d689d1bd-c693-42cc-875b-5387a78ede9d","Type":"ContainerStarted","Data":"521734c091f05cef615760506bef7e460fbe13752bd455badddbda4d53782fa8"} May 06 17:40:45.860428 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:45.860272 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" May 06 17:40:45.883219 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:45.883173 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" podStartSLOduration=7.669198107 podStartE2EDuration="7.883159774s" podCreationTimestamp="2026-05-06 17:40:38 +0000 UTC" firstStartedPulling="2026-05-06 17:40:44.8553304 +0000 UTC m=+1818.496727975" lastFinishedPulling="2026-05-06 17:40:45.069292059 +0000 UTC m=+1818.710689642" observedRunningTime="2026-05-06 17:40:45.882286677 +0000 UTC m=+1819.523684287" 
watchObservedRunningTime="2026-05-06 17:40:45.883159774 +0000 UTC m=+1819.524557372" May 06 17:40:55.872910 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:55.872882 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb" May 06 17:40:56.876020 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:40:56.875986 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6" May 06 17:46:35.941532 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:46:35.941498 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-67b5fd54c8-ppr72_2b945d45-0a1f-46d3-a1a2-9c207d9d0710/authorino/0.log" May 06 17:46:41.243377 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:46:41.243347 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-698574c4f-98m5v_04562d63-745b-45e8-9842-354b822ed447/manager/0.log" May 06 17:46:42.905291 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:46:42.905251 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-67b5fd54c8-ppr72_2b945d45-0a1f-46d3-a1a2-9c207d9d0710/authorino/0.log" May 06 17:46:43.497311 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:46:43.497280 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-6q9bw_4f8a3fae-4bf9-44f4-b3a2-a33bf87984f6/manager/0.log" May 06 17:46:44.338309 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:46:44.338278 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-69f8cf9d8c-4vzkv_9f53b9e5-5c24-4c3d-a6f0-6c1a66e9b1d5/kube-auth-proxy/0.log" May 06 17:46:45.194320 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:46:45.194293 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb_203dab7c-0e97-47ae-8926-2a9b70e1bd41/storage-initializer/0.log" May 06 17:46:45.201098 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:46:45.201067 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-gclxb_203dab7c-0e97-47ae-8926-2a9b70e1bd41/main/0.log" May 06 17:46:45.313575 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:46:45.313545 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-wffqp_df812e4b-1ed9-487f-967e-ab638396a227/storage-initializer/0.log" May 06 17:46:45.320264 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:46:45.320246 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-wffqp_df812e4b-1ed9-487f-967e-ab638396a227/main/0.log" May 06 17:46:45.439242 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:46:45.439219 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6_d689d1bd-c693-42cc-875b-5387a78ede9d/storage-initializer/0.log" May 06 17:46:45.447173 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:46:45.447113 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc52fv6_d689d1bd-c693-42cc-875b-5387a78ede9d/main/0.log" May 06 17:46:57.994677 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:46:57.994637 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-wvb5c_8a7c9e10-c274-46eb-b020-deee09868a53/global-pull-secret-syncer/0.log" May 06 17:46:58.079388 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:46:58.079356 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-mtlxr_c7ac35c9-4cf5-489f-9fd7-4e950edc1678/konnectivity-agent/0.log" May 06 17:46:58.163067 ip-10-0-131-115 kubenswrapper[2578]: 
I0506 17:46:58.163042 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-115.ec2.internal_1ee67fdc31db59363fd513d0a4d3384a/haproxy/0.log" May 06 17:47:02.274775 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:02.274744 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-67b5fd54c8-ppr72_2b945d45-0a1f-46d3-a1a2-9c207d9d0710/authorino/0.log" May 06 17:47:02.439607 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:02.439557 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-6q9bw_4f8a3fae-4bf9-44f4-b3a2-a33bf87984f6/manager/0.log" May 06 17:47:03.652646 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:03.652620 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d26ad6c1-538b-46c9-beb2-48279a7382cd/alertmanager/0.log" May 06 17:47:03.674379 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:03.674354 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d26ad6c1-538b-46c9-beb2-48279a7382cd/config-reloader/0.log" May 06 17:47:03.704629 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:03.704603 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d26ad6c1-538b-46c9-beb2-48279a7382cd/kube-rbac-proxy-web/0.log" May 06 17:47:03.730322 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:03.730294 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d26ad6c1-538b-46c9-beb2-48279a7382cd/kube-rbac-proxy/0.log" May 06 17:47:03.752786 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:03.752767 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d26ad6c1-538b-46c9-beb2-48279a7382cd/kube-rbac-proxy-metric/0.log" May 06 17:47:03.776074 ip-10-0-131-115 
kubenswrapper[2578]: I0506 17:47:03.776054 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d26ad6c1-538b-46c9-beb2-48279a7382cd/prom-label-proxy/0.log" May 06 17:47:03.801027 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:03.801006 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d26ad6c1-538b-46c9-beb2-48279a7382cd/init-config-reloader/0.log" May 06 17:47:03.951764 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:03.951741 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-667d5c986-57xjw_5ddabe88-c951-48de-9c9d-61a43cde5935/metrics-server/0.log" May 06 17:47:03.978266 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:03.978240 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-655d88fc6c-6fkl6_a55e3b09-195e-4c75-9243-b827d82be012/monitoring-plugin/0.log" May 06 17:47:04.092631 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:04.092577 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h7dtw_4a629322-a916-4ef9-a2a3-40536587a62d/node-exporter/0.log" May 06 17:47:04.116250 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:04.116225 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h7dtw_4a629322-a916-4ef9-a2a3-40536587a62d/kube-rbac-proxy/0.log" May 06 17:47:04.139105 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:04.139044 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h7dtw_4a629322-a916-4ef9-a2a3-40536587a62d/init-textfile/0.log" May 06 17:47:04.275463 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:04.275438 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5cc99f7c99-rzxd8_70a32e35-d08a-40c8-a501-a2edd6df74e6/kube-rbac-proxy-main/0.log" May 06 
17:47:04.298420 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:04.298397 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5cc99f7c99-rzxd8_70a32e35-d08a-40c8-a501-a2edd6df74e6/kube-rbac-proxy-self/0.log" May 06 17:47:04.322168 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:04.322149 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5cc99f7c99-rzxd8_70a32e35-d08a-40c8-a501-a2edd6df74e6/openshift-state-metrics/0.log" May 06 17:47:04.364561 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:04.364542 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_623dd84d-9fb6-4d35-bd9d-8202b9017be2/prometheus/0.log" May 06 17:47:04.391140 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:04.391074 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_623dd84d-9fb6-4d35-bd9d-8202b9017be2/config-reloader/0.log" May 06 17:47:04.412191 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:04.412166 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_623dd84d-9fb6-4d35-bd9d-8202b9017be2/thanos-sidecar/0.log" May 06 17:47:04.434267 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:04.434247 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_623dd84d-9fb6-4d35-bd9d-8202b9017be2/kube-rbac-proxy-web/0.log" May 06 17:47:04.457870 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:04.457849 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_623dd84d-9fb6-4d35-bd9d-8202b9017be2/kube-rbac-proxy/0.log" May 06 17:47:04.479497 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:04.479478 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_623dd84d-9fb6-4d35-bd9d-8202b9017be2/kube-rbac-proxy-thanos/0.log" May 06 17:47:04.504828 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:04.504812 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_623dd84d-9fb6-4d35-bd9d-8202b9017be2/init-config-reloader/0.log" May 06 17:47:04.596725 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:04.596702 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-64b84d7657-gmq9q_b853761b-55f0-41f4-a922-c8f1d5fdfc58/prometheus-operator-admission-webhook/0.log" May 06 17:47:04.626520 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:04.626489 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5b669d585c-7xqfk_0914baa9-2745-4b38-856b-bcc60beae5d4/telemeter-client/0.log" May 06 17:47:04.648531 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:04.648473 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5b669d585c-7xqfk_0914baa9-2745-4b38-856b-bcc60beae5d4/reload/0.log" May 06 17:47:04.672256 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:04.672237 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5b669d585c-7xqfk_0914baa9-2745-4b38-856b-bcc60beae5d4/kube-rbac-proxy/0.log" May 06 17:47:06.332665 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.332631 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc"] May 06 17:47:06.336020 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.336004 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc" May 06 17:47:06.338534 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.338512 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xnzz7\"/\"kube-root-ca.crt\"" May 06 17:47:06.338672 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.338605 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xnzz7\"/\"default-dockercfg-crpl7\"" May 06 17:47:06.338672 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.338624 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xnzz7\"/\"openshift-service-ca.crt\"" May 06 17:47:06.344713 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.344694 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc"] May 06 17:47:06.375499 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.375478 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4418250-992e-43f0-9cb6-282613d725af-lib-modules\") pod \"perf-node-gather-daemonset-qqxrc\" (UID: \"e4418250-992e-43f0-9cb6-282613d725af\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc" May 06 17:47:06.375618 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.375511 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e4418250-992e-43f0-9cb6-282613d725af-podres\") pod \"perf-node-gather-daemonset-qqxrc\" (UID: \"e4418250-992e-43f0-9cb6-282613d725af\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc" May 06 17:47:06.375618 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.375542 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj7bd\" (UniqueName: \"kubernetes.io/projected/e4418250-992e-43f0-9cb6-282613d725af-kube-api-access-mj7bd\") pod \"perf-node-gather-daemonset-qqxrc\" (UID: \"e4418250-992e-43f0-9cb6-282613d725af\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc" May 06 17:47:06.375705 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.375664 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4418250-992e-43f0-9cb6-282613d725af-sys\") pod \"perf-node-gather-daemonset-qqxrc\" (UID: \"e4418250-992e-43f0-9cb6-282613d725af\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc" May 06 17:47:06.375705 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.375694 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e4418250-992e-43f0-9cb6-282613d725af-proc\") pod \"perf-node-gather-daemonset-qqxrc\" (UID: \"e4418250-992e-43f0-9cb6-282613d725af\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc" May 06 17:47:06.476797 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.476773 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e4418250-992e-43f0-9cb6-282613d725af-podres\") pod \"perf-node-gather-daemonset-qqxrc\" (UID: \"e4418250-992e-43f0-9cb6-282613d725af\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc" May 06 17:47:06.476936 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.476814 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mj7bd\" (UniqueName: \"kubernetes.io/projected/e4418250-992e-43f0-9cb6-282613d725af-kube-api-access-mj7bd\") pod \"perf-node-gather-daemonset-qqxrc\" (UID: 
\"e4418250-992e-43f0-9cb6-282613d725af\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc"
May 06 17:47:06.476936 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.476852 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4418250-992e-43f0-9cb6-282613d725af-sys\") pod \"perf-node-gather-daemonset-qqxrc\" (UID: \"e4418250-992e-43f0-9cb6-282613d725af\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc"
May 06 17:47:06.476936 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.476871 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e4418250-992e-43f0-9cb6-282613d725af-proc\") pod \"perf-node-gather-daemonset-qqxrc\" (UID: \"e4418250-992e-43f0-9cb6-282613d725af\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc"
May 06 17:47:06.476936 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.476902 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4418250-992e-43f0-9cb6-282613d725af-lib-modules\") pod \"perf-node-gather-daemonset-qqxrc\" (UID: \"e4418250-992e-43f0-9cb6-282613d725af\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc"
May 06 17:47:06.477089 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.476952 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e4418250-992e-43f0-9cb6-282613d725af-podres\") pod \"perf-node-gather-daemonset-qqxrc\" (UID: \"e4418250-992e-43f0-9cb6-282613d725af\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc"
May 06 17:47:06.477089 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.476969 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e4418250-992e-43f0-9cb6-282613d725af-proc\") pod \"perf-node-gather-daemonset-qqxrc\" (UID: \"e4418250-992e-43f0-9cb6-282613d725af\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc"
May 06 17:47:06.477089 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.476957 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4418250-992e-43f0-9cb6-282613d725af-sys\") pod \"perf-node-gather-daemonset-qqxrc\" (UID: \"e4418250-992e-43f0-9cb6-282613d725af\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc"
May 06 17:47:06.477089 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.477051 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4418250-992e-43f0-9cb6-282613d725af-lib-modules\") pod \"perf-node-gather-daemonset-qqxrc\" (UID: \"e4418250-992e-43f0-9cb6-282613d725af\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc"
May 06 17:47:06.485270 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.485251 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj7bd\" (UniqueName: \"kubernetes.io/projected/e4418250-992e-43f0-9cb6-282613d725af-kube-api-access-mj7bd\") pod \"perf-node-gather-daemonset-qqxrc\" (UID: \"e4418250-992e-43f0-9cb6-282613d725af\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc"
May 06 17:47:06.646754 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.646659 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc"
May 06 17:47:06.773304 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.773273 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc"]
May 06 17:47:06.776995 ip-10-0-131-115 kubenswrapper[2578]: W0506 17:47:06.776969 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode4418250_992e_43f0_9cb6_282613d725af.slice/crio-504c0e3b75a19cef0c7e86c6f37888403c2b8fcf036128a50e78f0ce101e29e1 WatchSource:0}: Error finding container 504c0e3b75a19cef0c7e86c6f37888403c2b8fcf036128a50e78f0ce101e29e1: Status 404 returned error can't find the container with id 504c0e3b75a19cef0c7e86c6f37888403c2b8fcf036128a50e78f0ce101e29e1
May 06 17:47:06.778530 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:06.778515 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
May 06 17:47:07.020714 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:07.020679 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c64c884c-pbc48_18c36a0e-cbf5-45a5-bb07-25bd57a01f2d/console/0.log"
May 06 17:47:07.128866 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:07.128834 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc" event={"ID":"e4418250-992e-43f0-9cb6-282613d725af","Type":"ContainerStarted","Data":"aaa71681d3d4d14bfc2cd25c50e39b4aa119c3d4343df88d49dca590848591bb"}
May 06 17:47:07.128866 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:07.128869 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc" event={"ID":"e4418250-992e-43f0-9cb6-282613d725af","Type":"ContainerStarted","Data":"504c0e3b75a19cef0c7e86c6f37888403c2b8fcf036128a50e78f0ce101e29e1"}
May 06 17:47:07.129069 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:07.128891 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc"
May 06 17:47:07.146315 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:07.146251 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc" podStartSLOduration=1.146235668 podStartE2EDuration="1.146235668s" podCreationTimestamp="2026-05-06 17:47:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-06 17:47:07.145411198 +0000 UTC m=+2200.786808810" watchObservedRunningTime="2026-05-06 17:47:07.146235668 +0000 UTC m=+2200.787633265"
May 06 17:47:08.470529 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:08.470502 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-b4hs8_bf278082-faf7-485f-9ac3-52430773540c/dns/0.log"
May 06 17:47:08.533171 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:08.533140 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-b4hs8_bf278082-faf7-485f-9ac3-52430773540c/kube-rbac-proxy/0.log"
May 06 17:47:08.693162 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:08.693136 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4lbvf_d07b8d8f-4948-4e10-8402-db41f4c64242/dns-node-resolver/0.log"
May 06 17:47:09.294770 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:09.294739 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pwg45_9225285c-2bd9-49db-af85-96b0a3f45d5a/node-ca/0.log"
May 06 17:47:10.348911 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:10.348881 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-69f8cf9d8c-4vzkv_9f53b9e5-5c24-4c3d-a6f0-6c1a66e9b1d5/kube-auth-proxy/0.log"
May 06 17:47:10.998207 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:10.998174 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-d774v_40f1eb9d-6ab0-426e-aee9-f57d3f7a16ed/serve-healthcheck-canary/0.log"
May 06 17:47:11.614341 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:11.614307 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tgn76_ab661ca8-1bc3-42f1-843e-1b450eaffbfc/kube-rbac-proxy/0.log"
May 06 17:47:11.636305 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:11.636281 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tgn76_ab661ca8-1bc3-42f1-843e-1b450eaffbfc/exporter/0.log"
May 06 17:47:11.659621 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:11.659603 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tgn76_ab661ca8-1bc3-42f1-843e-1b450eaffbfc/extractor/0.log"
May 06 17:47:13.141534 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:13.141509 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-qqxrc"
May 06 17:47:13.682750 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:13.682708 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-698574c4f-98m5v_04562d63-745b-45e8-9842-354b822ed447/manager/0.log"
May 06 17:47:14.995617 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:14.995562 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7fd7474bdd-99k2r_fc5dde84-ff65-4a2b-8fa2-291004ae61c7/manager/0.log"
May 06 17:47:15.052065 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:15.051966 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-fmtz2_4f1c1b72-2fde-4d8c-940e-ef9ad420264e/openshift-lws-operator/0.log"
May 06 17:47:19.589881 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:19.589851 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-5f598d4645-z425g_a8efde7e-259d-44b2-8529-28c7ccbbff38/migrator/0.log"
May 06 17:47:19.615398 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:19.615374 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-5f598d4645-z425g_a8efde7e-259d-44b2-8529-28c7ccbbff38/graceful-termination/0.log"
May 06 17:47:21.412123 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:21.412095 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z6k67_ae7c9848-2f33-464c-a9e2-b97ba5c1b57c/kube-multus-additional-cni-plugins/0.log"
May 06 17:47:21.434716 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:21.434691 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z6k67_ae7c9848-2f33-464c-a9e2-b97ba5c1b57c/egress-router-binary-copy/0.log"
May 06 17:47:21.455982 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:21.455962 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z6k67_ae7c9848-2f33-464c-a9e2-b97ba5c1b57c/cni-plugins/0.log"
May 06 17:47:21.477362 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:21.477342 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z6k67_ae7c9848-2f33-464c-a9e2-b97ba5c1b57c/bond-cni-plugin/0.log"
May 06 17:47:21.500641 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:21.500622 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z6k67_ae7c9848-2f33-464c-a9e2-b97ba5c1b57c/routeoverride-cni/0.log"
May 06 17:47:21.522136 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:21.522117 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z6k67_ae7c9848-2f33-464c-a9e2-b97ba5c1b57c/whereabouts-cni-bincopy/0.log"
May 06 17:47:21.545518 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:21.545491 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z6k67_ae7c9848-2f33-464c-a9e2-b97ba5c1b57c/whereabouts-cni/0.log"
May 06 17:47:21.582668 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:21.582648 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f7wdz_cbdbc6d6-c384-4259-8a3b-f40e37586a30/kube-multus/0.log"
May 06 17:47:21.630408 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:21.630387 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4cnzp_e0f197dc-dd5c-4f23-ad0d-f076fc70415f/network-metrics-daemon/0.log"
May 06 17:47:21.652924 ip-10-0-131-115 kubenswrapper[2578]: I0506 17:47:21.652899 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4cnzp_e0f197dc-dd5c-4f23-ad0d-f076fc70415f/kube-rbac-proxy/0.log"