Apr 16 18:02:09.581897 ip-10-0-134-55 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:02:10.003763 ip-10-0-134-55 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:02:10.003763 ip-10-0-134-55 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:02:10.003763 ip-10-0-134-55 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:02:10.003763 ip-10-0-134-55 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:02:10.003763 ip-10-0-134-55 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
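(Editor's note: the deprecation warnings above all point at the same remedy — move these flags into the file passed via --config. A minimal sketch of an equivalent KubeletConfiguration, assuming the kubelet.config.k8s.io/v1beta1 field names; only the runtime endpoint value appears in this log, the other values are illustrative placeholders, not this node's actual settings.)

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (value as logged in the FLAG dump below)
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --system-reserved (illustrative reservation; not taken from this log)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction settings (illustrative threshold)
evictionHard:
  memory.available: 100Mi
# replaces --volume-plugin-dir (illustrative path)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
```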
Apr 16 18:02:10.006912 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.006821 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:02:10.009759 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009746 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:02:10.009759 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009759 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:02:10.009822 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009763 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:02:10.009822 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009766 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:02:10.009822 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009770 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:02:10.009822 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009773 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:02:10.009822 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009776 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:02:10.009822 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009778 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:02:10.009822 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009781 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:02:10.009822 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009784 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:02:10.009822 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009786 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:02:10.009822 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009789 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:02:10.009822 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009791 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:02:10.009822 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009794 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:02:10.009822 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009797 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:02:10.009822 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009809 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:02:10.009822 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009812 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:02:10.009822 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009815 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:02:10.009822 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009817 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:02:10.009822 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009820 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:02:10.009822 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009823 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:02:10.009822 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009826 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:02:10.010386 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009829 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:02:10.010386 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009832 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:02:10.010386 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009835 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:02:10.010386 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009838 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:02:10.010386 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009841 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:02:10.010386 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009843 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:02:10.010386 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009846 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:02:10.010386 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009848 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:02:10.010386 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009851 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:02:10.010386 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009854 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:02:10.010386 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009856 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:02:10.010386 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009859 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:02:10.010386 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009861 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:02:10.010386 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009864 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:02:10.010386 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009867 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:02:10.010386 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009869 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:02:10.010386 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009872 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:02:10.010386 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009875 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:02:10.010386 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009878 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:02:10.010386 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009880 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:02:10.011123 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009883 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:02:10.011123 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009885 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:02:10.011123 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009888 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:02:10.011123 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009890 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:02:10.011123 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009893 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:02:10.011123 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009896 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:02:10.011123 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009899 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:02:10.011123 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009903 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:02:10.011123 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009906 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:02:10.011123 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009909 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:02:10.011123 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009911 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:02:10.011123 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009914 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:02:10.011123 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009916 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:02:10.011123 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009919 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:02:10.011123 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009922 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:02:10.011123 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009924 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:02:10.011123 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009927 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:02:10.011123 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009929 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:02:10.011123 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009932 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:02:10.011646 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009935 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:02:10.011646 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009937 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:02:10.011646 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009940 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:02:10.011646 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009943 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:02:10.011646 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009945 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:02:10.011646 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009948 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:02:10.011646 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009950 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:02:10.011646 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009954 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:02:10.011646 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009957 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:02:10.011646 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009959 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:02:10.011646 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009962 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:02:10.011646 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009964 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:02:10.011646 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009967 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:02:10.011646 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009970 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:02:10.011646 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009972 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:02:10.011646 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009975 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:02:10.011646 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009986 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:02:10.011646 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009989 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:02:10.011646 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009991 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:02:10.011646 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009994 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:02:10.012130 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.009999 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:02:10.012130 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.010003 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:02:10.012130 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.010007 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:02:10.012130 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.010010 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:02:10.012130 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.010013 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:02:10.012130 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011553 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:02:10.012130 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011564 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:02:10.012130 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011567 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:02:10.012130 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011571 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:02:10.012130 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011574 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:02:10.012130 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011578 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:02:10.012130 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011581 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:02:10.012130 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011583 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:02:10.012130 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011586 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:02:10.012130 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011589 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:02:10.012130 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011592 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:02:10.012130 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011594 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:02:10.012130 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011615 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:02:10.012130 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011619 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:02:10.012592 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011622 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:02:10.012592 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011625 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:02:10.012592 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011627 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:02:10.012592 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011630 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:02:10.012592 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011633 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:02:10.012592 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011636 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:02:10.012592 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011638 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:02:10.012592 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011641 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:02:10.012592 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011643 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:02:10.012592 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011646 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:02:10.012592 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011648 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:02:10.012592 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011651 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:02:10.012592 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011654 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:02:10.012592 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011657 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:02:10.012592 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011660 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:02:10.012592 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011664 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:02:10.012592 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011668 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:02:10.012592 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011671 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:02:10.012592 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011678 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:02:10.012592 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011681 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:02:10.013107 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011684 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:02:10.013107 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011687 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:02:10.013107 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011689 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:02:10.013107 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011692 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:02:10.013107 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011695 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:02:10.013107 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011697 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:02:10.013107 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011700 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:02:10.013107 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011702 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:02:10.013107 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011705 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:02:10.013107 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011708 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:02:10.013107 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011710 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:02:10.013107 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011713 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:02:10.013107 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011715 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:02:10.013107 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011718 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:02:10.013107 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011720 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:02:10.013107 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011723 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:02:10.013107 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011725 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:02:10.013107 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011728 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:02:10.013107 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011733 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:02:10.013562 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011737 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:02:10.013562 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011739 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:02:10.013562 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011742 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:02:10.013562 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011745 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:02:10.013562 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011749 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:02:10.013562 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011753 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:02:10.013562 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011755 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:02:10.013562 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011758 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:02:10.013562 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011761 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:02:10.013562 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011763 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:02:10.013562 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011766 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:02:10.013562 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011769 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:02:10.013562 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011774 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:02:10.013562 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011778 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:02:10.013562 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011781 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:02:10.013562 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011783 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:02:10.013562 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011786 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:02:10.013562 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011788 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:02:10.013562 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011791 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:02:10.013562 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011793 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:02:10.014071 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011796 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:02:10.014071 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011799 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:02:10.014071 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011801 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:02:10.014071 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011804 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:02:10.014071 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011806 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:02:10.014071 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011809 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:02:10.014071 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011811 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:02:10.014071 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011814 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:02:10.014071 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011817 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:02:10.014071 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011819 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:02:10.014071 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011822 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:02:10.014071 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011825 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:02:10.014071 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.011827 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:02:10.014071 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012881 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:02:10.014071 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012892 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:02:10.014071 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012899 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:02:10.014071 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012904 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:02:10.014071 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012909 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:02:10.014071 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012912 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:02:10.014071 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012916 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:02:10.014071 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012922 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012925 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012928 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012931 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012935 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012938 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012941 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012944 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012947 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012950 2578 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012952 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012955 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012959 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012962 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012965 2578 flags.go:64] FLAG: --config-dir=""
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012968 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012971 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012977 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012980 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012983 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012986 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012989 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012992 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012995 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.012998 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:02:10.014576 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013001 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013006 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013009 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013011 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013014 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013017 2578 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013020 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013024 2578 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013027 2578 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013030 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013033 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013036 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013040 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013043 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013046 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013049 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013052 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013055 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013058 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013061 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013063 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013066 2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013069 2578 flags.go:64] FLAG: --feature-gates=""
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013073 2578 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013076 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 18:02:10.015189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013080 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013084 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013087 2578 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013090 2578 flags.go:64] FLAG: --help="false"
Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013093 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-134-55.ec2.internal"
Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013096 2578 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013099 2578 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 18:02:10.015808
ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013102 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013106 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013109 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013112 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013115 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013118 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013121 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013124 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013127 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013130 2578 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013132 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013135 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013138 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013141 2578 flags.go:64] FLAG: 
--local-storage-capacity-isolation="true" Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013144 2578 flags.go:64] FLAG: --lock-file="" Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013146 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013149 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:02:10.015808 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013152 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013157 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013161 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013164 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013167 2578 flags.go:64] FLAG: --logging-format="text" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013169 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013173 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013175 2578 flags.go:64] FLAG: --manifest-url="" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013178 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013183 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013186 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 
18:02:10.013191 2578 flags.go:64] FLAG: --max-pods="110" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013194 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013197 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013200 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013203 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013205 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013208 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013211 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013225 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013228 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013231 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013234 2578 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:02:10.016378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013236 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013241 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:02:10.016973 
ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013244 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013247 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013249 2578 flags.go:64] FLAG: --port="10250" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013252 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013255 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-09b08c6aca1820634" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013258 2578 flags.go:64] FLAG: --qos-reserved="" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013261 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013264 2578 flags.go:64] FLAG: --register-node="true" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013266 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013269 2578 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013273 2578 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013276 2578 flags.go:64] FLAG: --registry-qps="5" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013279 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013282 2578 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013285 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 
18:02:10.013288 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013291 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013295 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013297 2578 flags.go:64] FLAG: --runonce="false" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013300 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013303 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013306 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013308 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013311 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 18:02:10.016973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013314 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013317 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013320 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013323 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013327 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013330 2578 flags.go:64] FLAG: --storage-driver-user="root" 
Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013333 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013336 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013339 2578 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013341 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013347 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013350 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013353 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013357 2578 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013360 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013363 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013365 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013368 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013371 2578 flags.go:64] FLAG: --v="2" Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013379 2578 flags.go:64] FLAG: --version="false" Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013383 2578 flags.go:64] FLAG: --vmodule="" Apr 16 
18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013387 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.013390 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013485 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013489 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:02:10.017632 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013492 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:02:10.018277 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013497 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:02:10.018277 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013500 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:02:10.018277 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013502 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:02:10.018277 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013505 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:02:10.018277 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013508 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:02:10.018277 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013510 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:02:10.018277 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013513 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:02:10.018277 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013516 2578 feature_gate.go:328] 
unrecognized feature gate: Example2 Apr 16 18:02:10.018277 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013521 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:02:10.018277 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013524 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:02:10.018277 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013529 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:02:10.018277 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013532 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:02:10.018277 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013535 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:02:10.018277 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013537 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:02:10.018277 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013540 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:02:10.018277 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013542 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:02:10.018277 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013545 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:02:10.018277 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013547 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:02:10.018277 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013550 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:02:10.018277 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013553 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:02:10.018787 ip-10-0-134-55 
kubenswrapper[2578]: W0416 18:02:10.013555 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:02:10.018787 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013562 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:02:10.018787 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013565 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:02:10.018787 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013568 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:02:10.018787 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013571 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:02:10.018787 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013574 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:02:10.018787 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013577 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:02:10.018787 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013579 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:02:10.018787 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013583 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:02:10.018787 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013585 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:02:10.018787 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013588 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:02:10.018787 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013590 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:02:10.018787 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013592 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:02:10.018787 
ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013596 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:02:10.018787 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013615 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:02:10.018787 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013620 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:02:10.018787 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013624 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:02:10.018787 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013627 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:02:10.018787 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013630 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:02:10.018787 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013632 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:02:10.019320 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013635 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:02:10.019320 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013637 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:02:10.019320 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013641 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:02:10.019320 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013644 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:02:10.019320 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013647 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:02:10.019320 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013649 2578 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:02:10.019320 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013652 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:02:10.019320 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013655 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:02:10.019320 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013658 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:02:10.019320 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013660 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:02:10.019320 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013663 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:02:10.019320 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013665 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:02:10.019320 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013667 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:02:10.019320 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013670 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:02:10.019320 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013673 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:02:10.019320 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013677 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:02:10.019320 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013680 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:02:10.019320 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013682 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:02:10.019320 ip-10-0-134-55 
kubenswrapper[2578]: W0416 18:02:10.013687 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:02:10.019836 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013690 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:02:10.019836 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013693 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:02:10.019836 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013696 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:02:10.019836 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013698 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:02:10.019836 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013701 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:02:10.019836 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013704 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:02:10.019836 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013706 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:02:10.019836 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013710 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:02:10.019836 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013712 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:02:10.019836 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013715 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:02:10.019836 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013717 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:02:10.019836 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013720 2578 feature_gate.go:328] unrecognized feature 
gate: VSphereMixedNodeEnv Apr 16 18:02:10.019836 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013722 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:02:10.019836 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013724 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:02:10.019836 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013727 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:02:10.019836 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013731 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:02:10.019836 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013734 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:02:10.019836 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013736 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:02:10.019836 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013739 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:02:10.019836 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013741 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:02:10.020315 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013744 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:02:10.020315 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013746 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:02:10.020315 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013749 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:02:10.020315 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.013752 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:02:10.020315 ip-10-0-134-55 
kubenswrapper[2578]: I0416 18:02:10.014428 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:02:10.020929 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.020909 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 18:02:10.020966 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.020930 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 18:02:10.020999 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.020984 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:02:10.020999 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.020990 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:02:10.020999 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.020993 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:02:10.020999 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.020996 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:02:10.020999 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.020999 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:02:10.021128 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021003 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:02:10.021128 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021006 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:02:10.021128 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021009 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:02:10.021128 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021012 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:02:10.021128 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021015 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:02:10.021128 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021018 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:02:10.021128 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021020 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:02:10.021128 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021023 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:02:10.021128 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021026 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:02:10.021128 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021028 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:02:10.021128 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021031 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:02:10.021128 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021033 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:02:10.021128 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021036 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:02:10.021128 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021038 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:02:10.021128 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021041 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:02:10.021128 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021043 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:02:10.021128 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021047 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:02:10.021128 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021052 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:02:10.021128 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021055 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:02:10.021594 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021058 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:02:10.021594 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021061 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:02:10.021594 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021064 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:02:10.021594 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021067 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:02:10.021594 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021069 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:02:10.021594 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021072 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:02:10.021594 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021075 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:02:10.021594 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021079 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:02:10.021594 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021082 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:02:10.021594 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021085 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:02:10.021594 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021087 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:02:10.021594 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021090 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:02:10.021594 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021092 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:02:10.021594 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021095 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:02:10.021594 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021098 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:02:10.021594 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021100 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:02:10.021594 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021103 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:02:10.021594 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021106 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:02:10.021594 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021108 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:02:10.021594 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021111 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:02:10.022136 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021114 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:02:10.022136 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021116 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:02:10.022136 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021119 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:02:10.022136 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021121 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:02:10.022136 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021124 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:02:10.022136 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021126 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:02:10.022136 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021129 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:02:10.022136 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021132 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:02:10.022136 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021134 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:02:10.022136 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021137 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:02:10.022136 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021140 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:02:10.022136 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021143 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:02:10.022136 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021145 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:02:10.022136 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021147 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:02:10.022136 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021150 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:02:10.022136 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021152 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:02:10.022136 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021155 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:02:10.022136 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021157 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:02:10.022136 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021160 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:02:10.022136 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021162 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:02:10.022717 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021167 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:02:10.022717 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021170 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:02:10.022717 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021173 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:02:10.022717 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021175 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:02:10.022717 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021178 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:02:10.022717 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021180 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:02:10.022717 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021184 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:02:10.022717 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021186 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:02:10.022717 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021189 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:02:10.022717 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021192 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:02:10.022717 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021194 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:02:10.022717 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021197 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:02:10.022717 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021199 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:02:10.022717 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021202 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:02:10.022717 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021205 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:02:10.022717 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021207 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:02:10.022717 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021209 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:02:10.022717 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021212 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:02:10.022717 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021215 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:02:10.022717 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021217 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:02:10.023193 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021220 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:02:10.023193 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021222 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:02:10.023193 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.021227 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:02:10.023193 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021332 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:02:10.023193 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021336 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:02:10.023193 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021339 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:02:10.023193 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021341 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:02:10.023193 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021344 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:02:10.023193 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021347 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:02:10.023193 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021350 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:02:10.023193 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021353 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:02:10.023193 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021357 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:02:10.023193 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021359 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:02:10.023193 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021367 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:02:10.023193 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021370 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:02:10.023193 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021373 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:02:10.023580 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021376 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:02:10.023580 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021379 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:02:10.023580 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021381 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:02:10.023580 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021384 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:02:10.023580 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021386 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:02:10.023580 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021389 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:02:10.023580 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021391 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:02:10.023580 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021394 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:02:10.023580 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021396 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:02:10.023580 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021399 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:02:10.023580 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021401 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:02:10.023580 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021404 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:02:10.023580 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021406 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:02:10.023580 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021409 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:02:10.023580 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021411 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:02:10.023580 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021414 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:02:10.023580 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021416 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:02:10.023580 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021419 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:02:10.023580 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021423 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:02:10.024050 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021426 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:02:10.024050 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021429 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:02:10.024050 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021431 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:02:10.024050 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021433 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:02:10.024050 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021436 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:02:10.024050 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021438 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:02:10.024050 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021441 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:02:10.024050 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021443 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:02:10.024050 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021446 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:02:10.024050 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021448 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:02:10.024050 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021451 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:02:10.024050 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021454 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:02:10.024050 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021456 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:02:10.024050 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021459 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:02:10.024050 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021461 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:02:10.024050 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021464 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:02:10.024050 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021466 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:02:10.024050 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021469 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:02:10.024050 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021472 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:02:10.024050 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021474 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:02:10.024519 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021476 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:02:10.024519 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021479 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:02:10.024519 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021482 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:02:10.024519 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021484 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:02:10.024519 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021486 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:02:10.024519 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021489 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:02:10.024519 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021491 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:02:10.024519 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021494 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:02:10.024519 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021496 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:02:10.024519 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021499 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:02:10.024519 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021502 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:02:10.024519 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021506 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:02:10.024519 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021508 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:02:10.024519 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021511 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:02:10.024519 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021514 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:02:10.024519 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021517 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:02:10.024519 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021519 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:02:10.024519 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021522 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:02:10.024519 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021524 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:02:10.024992 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021527 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:02:10.024992 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021530 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:02:10.024992 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021533 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:02:10.024992 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021535 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:02:10.024992 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021538 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:02:10.024992 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021541 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:02:10.024992 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021544 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:02:10.024992 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021546 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:02:10.024992 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021549 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:02:10.024992 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021551 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:02:10.024992 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021554 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:02:10.024992 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021556 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:02:10.024992 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021558 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:02:10.024992 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021561 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:02:10.024992 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:10.021563 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:02:10.025357 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.021568 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:02:10.025357 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.022183 2578 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:02:10.028302 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.028288 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:02:10.029414 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.029403 2578 server.go:1019] "Starting client certificate rotation"
Apr 16 18:02:10.029514 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.029500 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:02:10.029548 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.029534 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:02:10.054389 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.054370 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:02:10.057874 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.057856 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:02:10.069484 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.069466 2578 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:02:10.074847 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.074825 2578 log.go:25] "Validated CRI v1 image API"
Apr 16 18:02:10.076051 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.076037 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:02:10.080854 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.080836 2578 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 d2b39e63-2034-4de0-adb7-bad48b769453:/dev/nvme0n1p3 fe0e7889-517b-4a7a-996c-2261eac99443:/dev/nvme0n1p4]
Apr 16 18:02:10.080932 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.080854 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:02:10.086378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.086279 2578 manager.go:217] Machine: {Timestamp:2026-04-16 18:02:10.084397182 +0000 UTC m=+0.388130568 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3107085 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d972bec7fea63c493f73de118edad SystemUUID:ec2d972b-ec7f-ea63-c493-f73de118edad BootID:e0578b57-a085-4e74-b8d0-f80666391ef6 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:07:cb:64:db:63 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:07:cb:64:db:63 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:66:24:b7:e2:a4:28 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:02:10.086378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.086373 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 18:02:10.086485 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.086448 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:02:10.087575 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.087553 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:02:10.087724 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.087578 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-55.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 18:02:10.087773 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.087734 2578 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 18:02:10.087773 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.087742 2578 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 18:02:10.087773 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.087755 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:02:10.088414 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.088398 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:02:10.088446 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.088408 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:02:10.089590 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.089579 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:02:10.089712 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.089703 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 18:02:10.092043 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.092031 2578 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 18:02:10.092087 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.092046 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 18:02:10.092087 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.092058 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 18:02:10.092087 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.092066 2578 kubelet.go:397] "Adding apiserver pod source"
Apr 16 18:02:10.092087 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.092075 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 18:02:10.093249 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.093229 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:02:10.093647 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.093259 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:02:10.096097 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.096080 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 18:02:10.098215 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.098201 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 18:02:10.099431 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.099416 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 18:02:10.099476 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.099447 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 18:02:10.099476 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.099456 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 18:02:10.099476 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.099461 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 18:02:10.099476 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.099466 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 18:02:10.099476
ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.099472 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 18:02:10.099476 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.099478 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 18:02:10.099676 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.099484 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 18:02:10.099676 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.099491 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 18:02:10.099676 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.099498 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 18:02:10.099676 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.099511 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 18:02:10.099676 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.099521 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 18:02:10.100333 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.100322 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 18:02:10.100333 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.100332 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 18:02:10.103854 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.103841 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 18:02:10.103917 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.103878 2578 server.go:1295] "Started kubelet" Apr 16 18:02:10.104000 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.103973 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 18:02:10.104047 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.103974 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" 
qps=100 burstTokens=10 Apr 16 18:02:10.104047 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.104037 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 18:02:10.104369 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.104354 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-55.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:02:10.104438 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:10.104412 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-55.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:02:10.104526 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:10.104508 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:02:10.104719 ip-10-0-134-55 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 18:02:10.105983 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.105964 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 16 18:02:10.107950 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.107935 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 18:02:10.112159 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.112132 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 18:02:10.112617 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.112587 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 18:02:10.113453 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.113434 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 18:02:10.113550 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.113541 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 18:02:10.113720 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.113440 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 18:02:10.114023 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.114013 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 16 18:02:10.114110 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.114091 2578 reconciler.go:26] "Reconciler: start to sync state" Apr 16 18:02:10.115279 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:10.114777 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-55.ec2.internal\" not found" Apr 16 18:02:10.115279 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.115013 2578 factory.go:55] Registering systemd factory Apr 16 18:02:10.115279 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.115070 2578 factory.go:223] Registration of the systemd container factory successfully Apr 16 18:02:10.115880 ip-10-0-134-55 kubenswrapper[2578]: I0416 
18:02:10.115822 2578 factory.go:153] Registering CRI-O factory Apr 16 18:02:10.115880 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.115837 2578 factory.go:223] Registration of the crio container factory successfully Apr 16 18:02:10.115993 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.115898 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 18:02:10.115993 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.115922 2578 factory.go:103] Registering Raw factory Apr 16 18:02:10.115993 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.115964 2578 manager.go:1196] Started watching for new ooms in manager Apr 16 18:02:10.116751 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:10.116707 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 18:02:10.117030 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.117008 2578 manager.go:319] Starting recovery of all containers Apr 16 18:02:10.124728 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.124684 2578 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 16 18:02:10.126760 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:10.126572 2578 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-55.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 18:02:10.127029 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:10.126985 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 18:02:10.127997 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:10.127117 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-55.ec2.internal.18a6e84f3f2781a2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-55.ec2.internal,UID:ip-10-0-134-55.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-55.ec2.internal,},FirstTimestamp:2026-04-16 18:02:10.103853474 +0000 UTC m=+0.407586861,LastTimestamp:2026-04-16 18:02:10.103853474 +0000 UTC m=+0.407586861,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-55.ec2.internal,}" Apr 16 18:02:10.128689 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.128674 2578 manager.go:324] Recovery completed Apr 16 18:02:10.131988 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.131968 2578 
csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mrmwn" Apr 16 18:02:10.132884 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.132872 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:10.135171 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.135156 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-55.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:10.135239 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.135181 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:10.135239 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.135191 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-55.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:10.135702 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.135688 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 18:02:10.135702 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.135703 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 18:02:10.135791 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.135718 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:02:10.137587 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.137576 2578 policy_none.go:49] "None policy: Start" Apr 16 18:02:10.137638 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.137591 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 18:02:10.137638 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.137632 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 16 18:02:10.141863 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:10.141805 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-55.ec2.internal.18a6e84f4105591e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-55.ec2.internal,UID:ip-10-0-134-55.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-55.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-55.ec2.internal,},FirstTimestamp:2026-04-16 18:02:10.13516931 +0000 UTC m=+0.438902696,LastTimestamp:2026-04-16 18:02:10.13516931 +0000 UTC m=+0.438902696,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-55.ec2.internal,}" Apr 16 18:02:10.148589 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.148574 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mrmwn" Apr 16 18:02:10.179158 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.179142 2578 manager.go:341] "Starting Device Plugin manager" Apr 16 18:02:10.187308 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:10.179175 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 18:02:10.187308 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.179188 2578 server.go:85] "Starting device plugin registration server" Apr 16 18:02:10.187308 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.179380 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 18:02:10.187308 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.179390 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 18:02:10.187308 ip-10-0-134-55 kubenswrapper[2578]: I0416 
18:02:10.179478 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 18:02:10.187308 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.179548 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 18:02:10.187308 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.179556 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 18:02:10.187308 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:10.180052 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 18:02:10.187308 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:10.180085 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-55.ec2.internal\" not found" Apr 16 18:02:10.250031 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.250008 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 18:02:10.250180 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.250043 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 18:02:10.250180 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.250066 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 18:02:10.250180 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.250076 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 18:02:10.250180 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:10.250117 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 18:02:10.252316 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.252299 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:10.280115 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.280066 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:10.280838 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.280823 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-55.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:10.280904 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.280851 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:10.280904 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.280864 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-55.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:10.280904 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.280887 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-55.ec2.internal" Apr 16 18:02:10.289697 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.289682 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-55.ec2.internal" Apr 16 18:02:10.289741 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:10.289704 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-55.ec2.internal\": node \"ip-10-0-134-55.ec2.internal\" not found" Apr 16 18:02:10.307398 
ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:10.307376 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-55.ec2.internal\" not found" Apr 16 18:02:10.350380 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.350359 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-55.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-55.ec2.internal"] Apr 16 18:02:10.350462 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.350416 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:10.351905 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.351892 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-55.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:10.351977 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.351917 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:10.351977 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.351926 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-55.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:10.352910 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.352896 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:10.353067 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.353053 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-55.ec2.internal" Apr 16 18:02:10.353116 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.353082 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:10.353650 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.353635 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-55.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:10.353731 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.353659 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:10.353731 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.353666 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-55.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:10.353731 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.353674 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-55.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:10.353731 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.353685 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:10.353731 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.353696 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-55.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:10.355203 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.355190 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-55.ec2.internal" Apr 16 18:02:10.355252 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.355215 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:10.356089 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.356066 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-55.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:10.356089 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.356089 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:10.356205 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.356103 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-55.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:10.377140 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:10.377122 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-55.ec2.internal\" not found" node="ip-10-0-134-55.ec2.internal" Apr 16 18:02:10.382298 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:10.382277 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-55.ec2.internal\" not found" node="ip-10-0-134-55.ec2.internal" Apr 16 18:02:10.407721 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:10.407698 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-55.ec2.internal\" not found" Apr 16 18:02:10.415101 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.415069 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5883b1b032604264e78d8f20b1008666-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-134-55.ec2.internal\" (UID: \"5883b1b032604264e78d8f20b1008666\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-55.ec2.internal" Apr 16 18:02:10.508219 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:10.508188 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-55.ec2.internal\" not found" Apr 16 18:02:10.515727 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.515704 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5883b1b032604264e78d8f20b1008666-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-55.ec2.internal\" (UID: \"5883b1b032604264e78d8f20b1008666\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-55.ec2.internal" Apr 16 18:02:10.515805 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.515736 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5883b1b032604264e78d8f20b1008666-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-55.ec2.internal\" (UID: \"5883b1b032604264e78d8f20b1008666\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-55.ec2.internal" Apr 16 18:02:10.515805 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.515757 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/eceb33926282e4ef98e3ef9d13dc120a-config\") pod \"kube-apiserver-proxy-ip-10-0-134-55.ec2.internal\" (UID: \"eceb33926282e4ef98e3ef9d13dc120a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-55.ec2.internal" Apr 16 18:02:10.515909 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.515863 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/5883b1b032604264e78d8f20b1008666-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-55.ec2.internal\" (UID: \"5883b1b032604264e78d8f20b1008666\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-55.ec2.internal" Apr 16 18:02:10.609247 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:10.609177 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-55.ec2.internal\" not found" Apr 16 18:02:10.616569 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.616547 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5883b1b032604264e78d8f20b1008666-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-55.ec2.internal\" (UID: \"5883b1b032604264e78d8f20b1008666\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-55.ec2.internal" Apr 16 18:02:10.616652 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.616573 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/eceb33926282e4ef98e3ef9d13dc120a-config\") pod \"kube-apiserver-proxy-ip-10-0-134-55.ec2.internal\" (UID: \"eceb33926282e4ef98e3ef9d13dc120a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-55.ec2.internal" Apr 16 18:02:10.616652 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.616646 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/eceb33926282e4ef98e3ef9d13dc120a-config\") pod \"kube-apiserver-proxy-ip-10-0-134-55.ec2.internal\" (UID: \"eceb33926282e4ef98e3ef9d13dc120a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-55.ec2.internal" Apr 16 18:02:10.616721 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.616656 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/5883b1b032604264e78d8f20b1008666-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-55.ec2.internal\" (UID: \"5883b1b032604264e78d8f20b1008666\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-55.ec2.internal" Apr 16 18:02:10.680763 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.680731 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-55.ec2.internal" Apr 16 18:02:10.684323 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:10.684304 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-55.ec2.internal" Apr 16 18:02:10.709834 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:10.709807 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-55.ec2.internal\" not found" Apr 16 18:02:10.810461 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:10.810411 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-55.ec2.internal\" not found" Apr 16 18:02:10.911034 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:10.910968 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-55.ec2.internal\" not found" Apr 16 18:02:11.011655 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:11.011634 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-55.ec2.internal\" not found" Apr 16 18:02:11.029173 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:11.029152 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 18:02:11.029291 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:11.029275 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 18:02:11.113104 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:11.113071 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-55.ec2.internal\" not found" Apr 16 18:02:11.113104 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:11.113088 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 18:02:11.126635 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:11.126595 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:02:11.150464 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:11.150433 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 17:57:10 +0000 UTC" deadline="2027-10-02 11:15:16.819593165 +0000 UTC" Apr 16 18:02:11.150464 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:11.150460 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12809h13m5.669136472s" Apr 16 18:02:11.152276 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:11.152257 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-j5ww2" Apr 16 18:02:11.160114 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:11.160097 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-j5ww2" Apr 16 18:02:11.167379 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:11.167328 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeceb33926282e4ef98e3ef9d13dc120a.slice/crio-b1f2d135dd000b5da84e0abb92b586bd23e2ab6011469f8a5d4365836259ff9e WatchSource:0}: Error finding container b1f2d135dd000b5da84e0abb92b586bd23e2ab6011469f8a5d4365836259ff9e: Status 404 returned error can't find the container with id b1f2d135dd000b5da84e0abb92b586bd23e2ab6011469f8a5d4365836259ff9e Apr 16 18:02:11.167900 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:11.167882 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5883b1b032604264e78d8f20b1008666.slice/crio-1c3d6db09b44e9041625622a24871091a90878ae7c45c730dc296b6cde10a0ec WatchSource:0}: Error finding container 1c3d6db09b44e9041625622a24871091a90878ae7c45c730dc296b6cde10a0ec: Status 404 returned error can't find the container with id 1c3d6db09b44e9041625622a24871091a90878ae7c45c730dc296b6cde10a0ec Apr 16 18:02:11.174270 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:11.174245 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:02:11.210615 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:11.210586 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:11.213147 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:11.213121 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-55.ec2.internal\" not found" Apr 16 18:02:11.253305 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:11.253257 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-55.ec2.internal" event={"ID":"eceb33926282e4ef98e3ef9d13dc120a","Type":"ContainerStarted","Data":"b1f2d135dd000b5da84e0abb92b586bd23e2ab6011469f8a5d4365836259ff9e"} Apr 16 18:02:11.254147 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:11.254122 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-55.ec2.internal" event={"ID":"5883b1b032604264e78d8f20b1008666","Type":"ContainerStarted","Data":"1c3d6db09b44e9041625622a24871091a90878ae7c45c730dc296b6cde10a0ec"} Apr 16 18:02:11.268831 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:11.268813 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:11.313208 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:11.313179 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-55.ec2.internal\" not found" Apr 16 18:02:11.413618 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:11.413576 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-55.ec2.internal\" not found" Apr 16 18:02:11.514153 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:11.514069 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-55.ec2.internal\" not found" Apr 16 18:02:11.605681 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:11.605653 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:11.612997 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:11.612968 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-55.ec2.internal" Apr 16 18:02:11.624752 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:11.624635 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:02:11.625455 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:11.625431 2578 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-55.ec2.internal" Apr 16 18:02:11.640453 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:11.640428 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:02:11.945675 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:11.945589 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:12.092565 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.092535 2578 apiserver.go:52] "Watching apiserver" Apr 16 18:02:12.102281 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.102259 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 18:02:12.104132 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.104106 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-vnntp","openshift-network-diagnostics/network-check-target-r8w9f","openshift-network-operator/iptables-alerter-s6vmk","openshift-ovn-kubernetes/ovnkube-node-r77l7","kube-system/konnectivity-agent-9gw8r","kube-system/kube-apiserver-proxy-ip-10-0-134-55.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf","openshift-dns/node-resolver-lgc4z","openshift-image-registry/node-ca-85mvt","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-55.ec2.internal","openshift-multus/multus-additional-cni-plugins-dgmt6","openshift-cluster-node-tuning-operator/tuned-x7cz7","openshift-multus/multus-nc622"] Apr 16 18:02:12.106942 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.106921 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.108442 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.107984 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8w9f" Apr 16 18:02:12.108442 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:12.108068 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8w9f" podUID="378c671e-1d04-4256-a001-a7ce2a0d3b86" Apr 16 18:02:12.108442 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.108080 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-s6vmk" Apr 16 18:02:12.109310 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.109292 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.110370 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.110353 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 18:02:12.110462 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.110368 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 18:02:12.110462 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.110382 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-9gw8r" Apr 16 18:02:12.110731 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.110714 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-29h8d\"" Apr 16 18:02:12.110947 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.110926 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 18:02:12.111392 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.111375 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 18:02:12.111503 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.111487 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 18:02:12.111567 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.111516 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:02:12.111741 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.111639 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-xs8hp\"" Apr 16 18:02:12.111870 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.111847 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 18:02:12.112019 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.111860 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vnntp" Apr 16 18:02:12.112019 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:12.111992 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vnntp" podUID="e1d07798-4149-483d-a793-b43a2b14fdbf" Apr 16 18:02:12.112217 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.112203 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-cj8nh\"" Apr 16 18:02:12.112274 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.112226 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 18:02:12.112274 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.112204 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 18:02:12.113307 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.113285 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-lgc4z" Apr 16 18:02:12.116697 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.114746 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 18:02:12.116697 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.114895 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 18:02:12.116697 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.114752 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 18:02:12.116697 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.115679 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-85mvt" Apr 16 18:02:12.116697 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.116046 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 18:02:12.116697 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.116447 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 18:02:12.116697 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.116549 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-2wzvj\"" Apr 16 18:02:12.117862 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.117843 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 18:02:12.117862 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.117853 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 18:02:12.118004 ip-10-0-134-55 kubenswrapper[2578]: I0416 
18:02:12.117930 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-nqx78\"" Apr 16 18:02:12.118277 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.118261 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 18:02:12.118510 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.118496 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-gws24\"" Apr 16 18:02:12.119017 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.118996 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.119158 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.119137 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.119369 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.119354 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 18:02:12.120272 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.120248 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-nc622" Apr 16 18:02:12.124636 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.124614 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-var-lib-openvswitch\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.124724 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.124657 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/35df5907-2c78-469e-9cb6-41fb0dfca177-registration-dir\") pod \"aws-ebs-csi-driver-node-8hvvf\" (UID: \"35df5907-2c78-469e-9cb6-41fb0dfca177\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.124724 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.124692 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7430323d-b0e8-4d21-9d45-f29e8e9f976f-host-slash\") pod \"iptables-alerter-s6vmk\" (UID: \"7430323d-b0e8-4d21-9d45-f29e8e9f976f\") " pod="openshift-network-operator/iptables-alerter-s6vmk" Apr 16 18:02:12.124835 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.124721 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-systemd-units\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.124835 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.124747 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/c11179d3-6347-4962-9a7f-b134abd01cf4-ovn-node-metrics-cert\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.124835 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.124770 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs\") pod \"network-metrics-daemon-vnntp\" (UID: \"e1d07798-4149-483d-a793-b43a2b14fdbf\") " pod="openshift-multus/network-metrics-daemon-vnntp" Apr 16 18:02:12.124835 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.124794 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-host-kubelet\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.124835 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.124817 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-run-openvswitch\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.125009 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.124840 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-run-ovn\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.125009 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.124868 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.125009 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.124895 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c11179d3-6347-4962-9a7f-b134abd01cf4-env-overrides\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.125009 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.124918 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzd98\" (UniqueName: \"kubernetes.io/projected/179b0e84-c4b0-49f4-af8c-88127e5c999a-kube-api-access-rzd98\") pod \"node-resolver-lgc4z\" (UID: \"179b0e84-c4b0-49f4-af8c-88127e5c999a\") " pod="openshift-dns/node-resolver-lgc4z" Apr 16 18:02:12.125009 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.124941 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35df5907-2c78-469e-9cb6-41fb0dfca177-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8hvvf\" (UID: \"35df5907-2c78-469e-9cb6-41fb0dfca177\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.125009 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.124983 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/35df5907-2c78-469e-9cb6-41fb0dfca177-etc-selinux\") pod \"aws-ebs-csi-driver-node-8hvvf\" 
(UID: \"35df5907-2c78-469e-9cb6-41fb0dfca177\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.125269 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125040 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcf99\" (UniqueName: \"kubernetes.io/projected/35df5907-2c78-469e-9cb6-41fb0dfca177-kube-api-access-lcf99\") pod \"aws-ebs-csi-driver-node-8hvvf\" (UID: \"35df5907-2c78-469e-9cb6-41fb0dfca177\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.125269 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125078 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-run-systemd\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.125269 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125129 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-log-socket\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.125269 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125165 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-host-cni-netd\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.125269 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125200 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3b31ea4d-734b-4f90-a86a-3706e8c5d58f-agent-certs\") pod \"konnectivity-agent-9gw8r\" (UID: \"3b31ea4d-734b-4f90-a86a-3706e8c5d58f\") " pod="kube-system/konnectivity-agent-9gw8r" Apr 16 18:02:12.125269 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125204 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 18:02:12.125269 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125245 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 18:02:12.125269 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125249 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mfkt\" (UniqueName: \"kubernetes.io/projected/e2c3dbfa-ec70-406e-af1c-6ac3c1aba031-kube-api-access-8mfkt\") pod \"node-ca-85mvt\" (UID: \"e2c3dbfa-ec70-406e-af1c-6ac3c1aba031\") " pod="openshift-image-registry/node-ca-85mvt" Apr 16 18:02:12.125714 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125278 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/35df5907-2c78-469e-9cb6-41fb0dfca177-device-dir\") pod \"aws-ebs-csi-driver-node-8hvvf\" (UID: \"35df5907-2c78-469e-9cb6-41fb0dfca177\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.125714 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125284 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 18:02:12.125714 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125312 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l98t\" (UniqueName: 
\"kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t\") pod \"network-check-target-r8w9f\" (UID: \"378c671e-1d04-4256-a001-a7ce2a0d3b86\") " pod="openshift-network-diagnostics/network-check-target-r8w9f" Apr 16 18:02:12.125714 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125342 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 18:02:12.125714 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125345 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-host-run-netns\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.125714 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125358 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 18:02:12.125714 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125368 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvfvf\" (UniqueName: \"kubernetes.io/projected/e1d07798-4149-483d-a793-b43a2b14fdbf-kube-api-access-nvfvf\") pod \"network-metrics-daemon-vnntp\" (UID: \"e1d07798-4149-483d-a793-b43a2b14fdbf\") " pod="openshift-multus/network-metrics-daemon-vnntp" Apr 16 18:02:12.125714 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125400 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2c3dbfa-ec70-406e-af1c-6ac3c1aba031-host\") pod \"node-ca-85mvt\" (UID: \"e2c3dbfa-ec70-406e-af1c-6ac3c1aba031\") " pod="openshift-image-registry/node-ca-85mvt" Apr 16 18:02:12.125714 ip-10-0-134-55 kubenswrapper[2578]: I0416 
18:02:12.125412 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 18:02:12.125714 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125468 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/179b0e84-c4b0-49f4-af8c-88127e5c999a-hosts-file\") pod \"node-resolver-lgc4z\" (UID: \"179b0e84-c4b0-49f4-af8c-88127e5c999a\") " pod="openshift-dns/node-resolver-lgc4z" Apr 16 18:02:12.125714 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125497 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/35df5907-2c78-469e-9cb6-41fb0dfca177-sys-fs\") pod \"aws-ebs-csi-driver-node-8hvvf\" (UID: \"35df5907-2c78-469e-9cb6-41fb0dfca177\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.125714 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125506 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 18:02:12.125714 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125522 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7430323d-b0e8-4d21-9d45-f29e8e9f976f-iptables-alerter-script\") pod \"iptables-alerter-s6vmk\" (UID: \"7430323d-b0e8-4d21-9d45-f29e8e9f976f\") " pod="openshift-network-operator/iptables-alerter-s6vmk" Apr 16 18:02:12.125714 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125550 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4w99\" (UniqueName: \"kubernetes.io/projected/7430323d-b0e8-4d21-9d45-f29e8e9f976f-kube-api-access-f4w99\") pod 
\"iptables-alerter-s6vmk\" (UID: \"7430323d-b0e8-4d21-9d45-f29e8e9f976f\") " pod="openshift-network-operator/iptables-alerter-s6vmk" Apr 16 18:02:12.125714 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125571 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-host-slash\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.125714 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125617 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-node-log\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.125714 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125637 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnvzw\" (UniqueName: \"kubernetes.io/projected/c11179d3-6347-4962-9a7f-b134abd01cf4-kube-api-access-gnvzw\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.125714 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125678 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-etc-openvswitch\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.125714 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125708 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-kbls5\"" Apr 16 18:02:12.126367 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125706 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c11179d3-6347-4962-9a7f-b134abd01cf4-ovnkube-config\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.126367 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125735 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:02:12.126367 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125740 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3b31ea4d-734b-4f90-a86a-3706e8c5d58f-konnectivity-ca\") pod \"konnectivity-agent-9gw8r\" (UID: \"3b31ea4d-734b-4f90-a86a-3706e8c5d58f\") " pod="kube-system/konnectivity-agent-9gw8r" Apr 16 18:02:12.126367 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125766 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e2c3dbfa-ec70-406e-af1c-6ac3c1aba031-serviceca\") pod \"node-ca-85mvt\" (UID: \"e2c3dbfa-ec70-406e-af1c-6ac3c1aba031\") " pod="openshift-image-registry/node-ca-85mvt" Apr 16 18:02:12.126367 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125806 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 18:02:12.126367 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125740 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-r9z45\"" Apr 
16 18:02:12.126367 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125804 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/35df5907-2c78-469e-9cb6-41fb0dfca177-socket-dir\") pod \"aws-ebs-csi-driver-node-8hvvf\" (UID: \"35df5907-2c78-469e-9cb6-41fb0dfca177\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.126367 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125898 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-host-run-ovn-kubernetes\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.126367 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125927 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-host-cni-bin\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.126367 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125959 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c11179d3-6347-4962-9a7f-b134abd01cf4-ovnkube-script-lib\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.126367 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.125984 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/179b0e84-c4b0-49f4-af8c-88127e5c999a-tmp-dir\") pod \"node-resolver-lgc4z\" (UID: \"179b0e84-c4b0-49f4-af8c-88127e5c999a\") " pod="openshift-dns/node-resolver-lgc4z" Apr 16 18:02:12.133884 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.133866 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-x6sn6\"" Apr 16 18:02:12.161159 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.161136 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:57:11 +0000 UTC" deadline="2027-10-25 19:28:04.488640652 +0000 UTC" Apr 16 18:02:12.161159 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.161159 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13369h25m52.327485061s" Apr 16 18:02:12.215560 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.215480 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 18:02:12.226316 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.226289 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnvzw\" (UniqueName: \"kubernetes.io/projected/c11179d3-6347-4962-9a7f-b134abd01cf4-kube-api-access-gnvzw\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.226457 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.226324 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-multus-conf-dir\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.226457 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.226352 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-host-run-multus-certs\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.226457 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.226396 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e2c3dbfa-ec70-406e-af1c-6ac3c1aba031-serviceca\") pod \"node-ca-85mvt\" (UID: \"e2c3dbfa-ec70-406e-af1c-6ac3c1aba031\") " pod="openshift-image-registry/node-ca-85mvt" Apr 16 18:02:12.226457 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.226421 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2d9e5152-75a1-43b4-9c2b-f42acb2075df-cnibin\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.226457 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.226442 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2d9e5152-75a1-43b4-9c2b-f42acb2075df-os-release\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.226725 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.226467 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-etc-kubernetes\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 
18:02:12.226725 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.226488 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-sys\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.226725 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.226512 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/35df5907-2c78-469e-9cb6-41fb0dfca177-socket-dir\") pod \"aws-ebs-csi-driver-node-8hvvf\" (UID: \"35df5907-2c78-469e-9cb6-41fb0dfca177\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.226725 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.226537 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-host-run-ovn-kubernetes\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.226725 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.226561 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c11179d3-6347-4962-9a7f-b134abd01cf4-ovnkube-script-lib\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.226725 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.226629 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-var-lib-openvswitch\") pod \"ovnkube-node-r77l7\" (UID: 
\"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.226725 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.226678 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d9e5152-75a1-43b4-9c2b-f42acb2075df-system-cni-dir\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.226725 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.226699 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-var-lib-openvswitch\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.226725 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.226706 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-lib-modules\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.227108 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.226745 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/35df5907-2c78-469e-9cb6-41fb0dfca177-registration-dir\") pod \"aws-ebs-csi-driver-node-8hvvf\" (UID: \"35df5907-2c78-469e-9cb6-41fb0dfca177\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.227108 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.226770 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/7430323d-b0e8-4d21-9d45-f29e8e9f976f-host-slash\") pod \"iptables-alerter-s6vmk\" (UID: \"7430323d-b0e8-4d21-9d45-f29e8e9f976f\") " pod="openshift-network-operator/iptables-alerter-s6vmk" Apr 16 18:02:12.227207 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.227142 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-systemd-units\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.227207 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.227176 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vn8q\" (UniqueName: \"kubernetes.io/projected/2d9e5152-75a1-43b4-9c2b-f42acb2075df-kube-api-access-8vn8q\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.227350 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.227325 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-host-run-ovn-kubernetes\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.227472 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.227451 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e2c3dbfa-ec70-406e-af1c-6ac3c1aba031-serviceca\") pod \"node-ca-85mvt\" (UID: \"e2c3dbfa-ec70-406e-af1c-6ac3c1aba031\") " pod="openshift-image-registry/node-ca-85mvt" Apr 16 18:02:12.227530 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.227499 2578 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/35df5907-2c78-469e-9cb6-41fb0dfca177-registration-dir\") pod \"aws-ebs-csi-driver-node-8hvvf\" (UID: \"35df5907-2c78-469e-9cb6-41fb0dfca177\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.227650 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.227633 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/35df5907-2c78-469e-9cb6-41fb0dfca177-socket-dir\") pod \"aws-ebs-csi-driver-node-8hvvf\" (UID: \"35df5907-2c78-469e-9cb6-41fb0dfca177\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.227707 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.227639 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7430323d-b0e8-4d21-9d45-f29e8e9f976f-host-slash\") pod \"iptables-alerter-s6vmk\" (UID: \"7430323d-b0e8-4d21-9d45-f29e8e9f976f\") " pod="openshift-network-operator/iptables-alerter-s6vmk" Apr 16 18:02:12.227756 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.227704 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-host-run-netns\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.227801 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.227768 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzd98\" (UniqueName: \"kubernetes.io/projected/179b0e84-c4b0-49f4-af8c-88127e5c999a-kube-api-access-rzd98\") pod \"node-resolver-lgc4z\" (UID: \"179b0e84-c4b0-49f4-af8c-88127e5c999a\") " pod="openshift-dns/node-resolver-lgc4z" Apr 16 18:02:12.227846 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.227780 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-systemd-units\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.227901 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.227850 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-cnibin\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.227945 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.227932 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-hostroot\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.227992 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.227968 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3b31ea4d-734b-4f90-a86a-3706e8c5d58f-agent-certs\") pod \"konnectivity-agent-9gw8r\" (UID: \"3b31ea4d-734b-4f90-a86a-3706e8c5d58f\") " pod="kube-system/konnectivity-agent-9gw8r" Apr 16 18:02:12.228060 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.228045 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/526fd613-ae22-4771-a739-ad3226226ba3-multus-daemon-config\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.228104 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.228081 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-etc-sysconfig\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.228104 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.228096 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c11179d3-6347-4962-9a7f-b134abd01cf4-ovnkube-script-lib\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.228195 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.228113 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/35df5907-2c78-469e-9cb6-41fb0dfca177-device-dir\") pod \"aws-ebs-csi-driver-node-8hvvf\" (UID: \"35df5907-2c78-469e-9cb6-41fb0dfca177\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.230718 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.228242 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvfvf\" (UniqueName: \"kubernetes.io/projected/e1d07798-4149-483d-a793-b43a2b14fdbf-kube-api-access-nvfvf\") pod \"network-metrics-daemon-vnntp\" (UID: \"e1d07798-4149-483d-a793-b43a2b14fdbf\") " pod="openshift-multus/network-metrics-daemon-vnntp" Apr 16 18:02:12.230718 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.228283 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/35df5907-2c78-469e-9cb6-41fb0dfca177-device-dir\") pod \"aws-ebs-csi-driver-node-8hvvf\" (UID: \"35df5907-2c78-469e-9cb6-41fb0dfca177\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.230718 
ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.228328 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2c3dbfa-ec70-406e-af1c-6ac3c1aba031-host\") pod \"node-ca-85mvt\" (UID: \"e2c3dbfa-ec70-406e-af1c-6ac3c1aba031\") " pod="openshift-image-registry/node-ca-85mvt" Apr 16 18:02:12.230718 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.228405 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-etc-sysctl-conf\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.230718 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.228514 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:02:12.230718 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.228566 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-var-lib-kubelet\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.230718 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.228627 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-host-slash\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.230718 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.228706 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2c3dbfa-ec70-406e-af1c-6ac3c1aba031-host\") pod \"node-ca-85mvt\" (UID: \"e2c3dbfa-ec70-406e-af1c-6ac3c1aba031\") " pod="openshift-image-registry/node-ca-85mvt" Apr 16 18:02:12.230718 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.228730 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-node-log\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.230718 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.228805 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-host-slash\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.230718 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.228799 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2d9e5152-75a1-43b4-9c2b-f42acb2075df-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.230718 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.228860 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-node-log\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.230718 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.228867 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-host-var-lib-cni-bin\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.230718 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.228915 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-etc-sysctl-d\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.230718 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.228960 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-etc-openvswitch\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.230718 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.228986 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c11179d3-6347-4962-9a7f-b134abd01cf4-ovnkube-config\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.230718 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.229041 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3b31ea4d-734b-4f90-a86a-3706e8c5d58f-konnectivity-ca\") pod \"konnectivity-agent-9gw8r\" (UID: \"3b31ea4d-734b-4f90-a86a-3706e8c5d58f\") " pod="kube-system/konnectivity-agent-9gw8r" Apr 16 18:02:12.231581 ip-10-0-134-55 kubenswrapper[2578]: I0416 
18:02:12.229085 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2d9e5152-75a1-43b4-9c2b-f42acb2075df-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.231581 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.229144 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-system-cni-dir\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.231581 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.229165 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-etc-openvswitch\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.231581 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.229216 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-host-run-k8s-cni-cncf-io\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.231581 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.229258 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-host-cni-bin\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" 
Apr 16 18:02:12.231581 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.229291 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/179b0e84-c4b0-49f4-af8c-88127e5c999a-tmp-dir\") pod \"node-resolver-lgc4z\" (UID: \"179b0e84-c4b0-49f4-af8c-88127e5c999a\") " pod="openshift-dns/node-resolver-lgc4z" Apr 16 18:02:12.231581 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.229809 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3b31ea4d-734b-4f90-a86a-3706e8c5d58f-konnectivity-ca\") pod \"konnectivity-agent-9gw8r\" (UID: \"3b31ea4d-734b-4f90-a86a-3706e8c5d58f\") " pod="kube-system/konnectivity-agent-9gw8r" Apr 16 18:02:12.231581 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.230292 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c11179d3-6347-4962-9a7f-b134abd01cf4-ovnkube-config\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.231581 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.230297 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2d9e5152-75a1-43b4-9c2b-f42acb2075df-cni-binary-copy\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.231581 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.230395 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-etc-systemd\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " 
pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.231581 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.230488 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-host-var-lib-cni-multus\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.231581 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.230518 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-host\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.231581 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.230525 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-host-cni-bin\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.231581 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.230542 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-tmp\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.231581 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.230576 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c11179d3-6347-4962-9a7f-b134abd01cf4-ovn-node-metrics-cert\") pod \"ovnkube-node-r77l7\" (UID: 
\"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.231581 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.230626 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs\") pod \"network-metrics-daemon-vnntp\" (UID: \"e1d07798-4149-483d-a793-b43a2b14fdbf\") " pod="openshift-multus/network-metrics-daemon-vnntp" Apr 16 18:02:12.231581 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.230747 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2d9e5152-75a1-43b4-9c2b-f42acb2075df-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.232364 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.230775 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/179b0e84-c4b0-49f4-af8c-88127e5c999a-tmp-dir\") pod \"node-resolver-lgc4z\" (UID: \"179b0e84-c4b0-49f4-af8c-88127e5c999a\") " pod="openshift-dns/node-resolver-lgc4z" Apr 16 18:02:12.232364 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.230793 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-multus-cni-dir\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.232364 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:12.230839 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 
18:02:12.232364 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.230841 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-os-release\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.232364 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:12.231022 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs podName:e1d07798-4149-483d-a793-b43a2b14fdbf nodeName:}" failed. No retries permitted until 2026-04-16 18:02:12.730895715 +0000 UTC m=+3.034629107 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs") pod "network-metrics-daemon-vnntp" (UID: "e1d07798-4149-483d-a793-b43a2b14fdbf") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:12.232364 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231052 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-run\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.232364 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231118 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-etc-tuned\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.232364 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231160 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-host-kubelet\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.232364 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231180 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-run-openvswitch\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.232364 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231199 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-run-ovn\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.232364 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231219 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.232364 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231240 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c11179d3-6347-4962-9a7f-b134abd01cf4-env-overrides\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.232364 ip-10-0-134-55 kubenswrapper[2578]: I0416 
18:02:12.231309 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/526fd613-ae22-4771-a739-ad3226226ba3-cni-binary-copy\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.232364 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231342 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gltrx\" (UniqueName: \"kubernetes.io/projected/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-kube-api-access-gltrx\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.232364 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231380 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35df5907-2c78-469e-9cb6-41fb0dfca177-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8hvvf\" (UID: \"35df5907-2c78-469e-9cb6-41fb0dfca177\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.232364 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231413 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/35df5907-2c78-469e-9cb6-41fb0dfca177-etc-selinux\") pod \"aws-ebs-csi-driver-node-8hvvf\" (UID: \"35df5907-2c78-469e-9cb6-41fb0dfca177\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.232364 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231448 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcf99\" (UniqueName: \"kubernetes.io/projected/35df5907-2c78-469e-9cb6-41fb0dfca177-kube-api-access-lcf99\") pod \"aws-ebs-csi-driver-node-8hvvf\" (UID: \"35df5907-2c78-469e-9cb6-41fb0dfca177\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.232991 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231687 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-run-systemd\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.232991 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231714 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-log-socket\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.232991 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231574 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-run-openvswitch\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.232991 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231646 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.232991 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231516 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c11179d3-6347-4962-9a7f-b134abd01cf4-env-overrides\") pod \"ovnkube-node-r77l7\" (UID: 
\"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.232991 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231751 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-host-cni-netd\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.232991 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231774 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mfkt\" (UniqueName: \"kubernetes.io/projected/e2c3dbfa-ec70-406e-af1c-6ac3c1aba031-kube-api-access-8mfkt\") pod \"node-ca-85mvt\" (UID: \"e2c3dbfa-ec70-406e-af1c-6ac3c1aba031\") " pod="openshift-image-registry/node-ca-85mvt" Apr 16 18:02:12.232991 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231815 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-run-systemd\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.232991 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231548 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-host-kubelet\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.232991 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231852 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-log-socket\") pod \"ovnkube-node-r77l7\" (UID: 
\"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.232991 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231879 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-host-cni-netd\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.232991 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231885 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/35df5907-2c78-469e-9cb6-41fb0dfca177-etc-selinux\") pod \"aws-ebs-csi-driver-node-8hvvf\" (UID: \"35df5907-2c78-469e-9cb6-41fb0dfca177\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.232991 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231922 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-run-ovn\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.232991 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.231975 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35df5907-2c78-469e-9cb6-41fb0dfca177-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8hvvf\" (UID: \"35df5907-2c78-469e-9cb6-41fb0dfca177\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.232991 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.232006 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-multus-socket-dir-parent\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.232991 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.232048 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8l98t\" (UniqueName: \"kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t\") pod \"network-check-target-r8w9f\" (UID: \"378c671e-1d04-4256-a001-a7ce2a0d3b86\") " pod="openshift-network-diagnostics/network-check-target-r8w9f" Apr 16 18:02:12.232991 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.232083 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-host-run-netns\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.233527 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.232116 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/179b0e84-c4b0-49f4-af8c-88127e5c999a-hosts-file\") pod \"node-resolver-lgc4z\" (UID: \"179b0e84-c4b0-49f4-af8c-88127e5c999a\") " pod="openshift-dns/node-resolver-lgc4z" Apr 16 18:02:12.233527 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.232147 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-host-var-lib-kubelet\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.233527 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.232181 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-dbsz7\" (UniqueName: \"kubernetes.io/projected/526fd613-ae22-4771-a739-ad3226226ba3-kube-api-access-dbsz7\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.233527 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.232193 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c11179d3-6347-4962-9a7f-b134abd01cf4-host-run-netns\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.233527 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.232214 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-etc-modprobe-d\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.233527 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.232272 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-etc-kubernetes\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.233527 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.232352 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/179b0e84-c4b0-49f4-af8c-88127e5c999a-hosts-file\") pod \"node-resolver-lgc4z\" (UID: \"179b0e84-c4b0-49f4-af8c-88127e5c999a\") " pod="openshift-dns/node-resolver-lgc4z" Apr 16 18:02:12.233527 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.232401 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/35df5907-2c78-469e-9cb6-41fb0dfca177-sys-fs\") pod \"aws-ebs-csi-driver-node-8hvvf\" (UID: \"35df5907-2c78-469e-9cb6-41fb0dfca177\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.233527 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.232504 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/35df5907-2c78-469e-9cb6-41fb0dfca177-sys-fs\") pod \"aws-ebs-csi-driver-node-8hvvf\" (UID: \"35df5907-2c78-469e-9cb6-41fb0dfca177\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.233527 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.232543 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7430323d-b0e8-4d21-9d45-f29e8e9f976f-iptables-alerter-script\") pod \"iptables-alerter-s6vmk\" (UID: \"7430323d-b0e8-4d21-9d45-f29e8e9f976f\") " pod="openshift-network-operator/iptables-alerter-s6vmk" Apr 16 18:02:12.233527 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.232584 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4w99\" (UniqueName: \"kubernetes.io/projected/7430323d-b0e8-4d21-9d45-f29e8e9f976f-kube-api-access-f4w99\") pod \"iptables-alerter-s6vmk\" (UID: \"7430323d-b0e8-4d21-9d45-f29e8e9f976f\") " pod="openshift-network-operator/iptables-alerter-s6vmk" Apr 16 18:02:12.233527 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.232724 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3b31ea4d-734b-4f90-a86a-3706e8c5d58f-agent-certs\") pod \"konnectivity-agent-9gw8r\" (UID: \"3b31ea4d-734b-4f90-a86a-3706e8c5d58f\") " pod="kube-system/konnectivity-agent-9gw8r" Apr 16 18:02:12.233527 ip-10-0-134-55 
kubenswrapper[2578]: I0416 18:02:12.233280 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7430323d-b0e8-4d21-9d45-f29e8e9f976f-iptables-alerter-script\") pod \"iptables-alerter-s6vmk\" (UID: \"7430323d-b0e8-4d21-9d45-f29e8e9f976f\") " pod="openshift-network-operator/iptables-alerter-s6vmk" Apr 16 18:02:12.233527 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.233306 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c11179d3-6347-4962-9a7f-b134abd01cf4-ovn-node-metrics-cert\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.239128 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:12.239109 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:12.239128 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:12.239130 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:12.239292 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:12.239144 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8l98t for pod openshift-network-diagnostics/network-check-target-r8w9f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:12.239292 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.239210 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzd98\" (UniqueName: \"kubernetes.io/projected/179b0e84-c4b0-49f4-af8c-88127e5c999a-kube-api-access-rzd98\") pod 
\"node-resolver-lgc4z\" (UID: \"179b0e84-c4b0-49f4-af8c-88127e5c999a\") " pod="openshift-dns/node-resolver-lgc4z" Apr 16 18:02:12.239292 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:12.239223 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t podName:378c671e-1d04-4256-a001-a7ce2a0d3b86 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:12.739205034 +0000 UTC m=+3.042938410 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-8l98t" (UniqueName: "kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t") pod "network-check-target-r8w9f" (UID: "378c671e-1d04-4256-a001-a7ce2a0d3b86") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:12.239751 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.239715 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnvzw\" (UniqueName: \"kubernetes.io/projected/c11179d3-6347-4962-9a7f-b134abd01cf4-kube-api-access-gnvzw\") pod \"ovnkube-node-r77l7\" (UID: \"c11179d3-6347-4962-9a7f-b134abd01cf4\") " pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.240014 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.239963 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvfvf\" (UniqueName: \"kubernetes.io/projected/e1d07798-4149-483d-a793-b43a2b14fdbf-kube-api-access-nvfvf\") pod \"network-metrics-daemon-vnntp\" (UID: \"e1d07798-4149-483d-a793-b43a2b14fdbf\") " pod="openshift-multus/network-metrics-daemon-vnntp" Apr 16 18:02:12.241515 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.241492 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mfkt\" (UniqueName: 
\"kubernetes.io/projected/e2c3dbfa-ec70-406e-af1c-6ac3c1aba031-kube-api-access-8mfkt\") pod \"node-ca-85mvt\" (UID: \"e2c3dbfa-ec70-406e-af1c-6ac3c1aba031\") " pod="openshift-image-registry/node-ca-85mvt" Apr 16 18:02:12.242310 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.242291 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4w99\" (UniqueName: \"kubernetes.io/projected/7430323d-b0e8-4d21-9d45-f29e8e9f976f-kube-api-access-f4w99\") pod \"iptables-alerter-s6vmk\" (UID: \"7430323d-b0e8-4d21-9d45-f29e8e9f976f\") " pod="openshift-network-operator/iptables-alerter-s6vmk" Apr 16 18:02:12.243015 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.242996 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcf99\" (UniqueName: \"kubernetes.io/projected/35df5907-2c78-469e-9cb6-41fb0dfca177-kube-api-access-lcf99\") pod \"aws-ebs-csi-driver-node-8hvvf\" (UID: \"35df5907-2c78-469e-9cb6-41fb0dfca177\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.333170 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333115 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-host-run-k8s-cni-cncf-io\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.333170 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333169 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2d9e5152-75a1-43b4-9c2b-f42acb2075df-cni-binary-copy\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.333376 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333197 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-etc-systemd\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.333376 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333224 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-host-var-lib-cni-multus\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.333376 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333237 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-host-run-k8s-cni-cncf-io\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.333376 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333249 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-host\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.333376 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333284 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-tmp\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.333376 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333296 2578 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-host-var-lib-cni-multus\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.333376 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333314 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2d9e5152-75a1-43b4-9c2b-f42acb2075df-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.333376 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333337 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-multus-cni-dir\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.333376 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333346 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-host\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.333376 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333362 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-os-release\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.333376 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333384 2578 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"run\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-run\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.333907 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333399 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-etc-tuned\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.333907 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333420 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/526fd613-ae22-4771-a739-ad3226226ba3-cni-binary-copy\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.333907 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333444 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gltrx\" (UniqueName: \"kubernetes.io/projected/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-kube-api-access-gltrx\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.333907 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333480 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-multus-socket-dir-parent\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.333907 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333525 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-host-var-lib-kubelet\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.333907 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333553 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbsz7\" (UniqueName: \"kubernetes.io/projected/526fd613-ae22-4771-a739-ad3226226ba3-kube-api-access-dbsz7\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.333907 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333576 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-etc-modprobe-d\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.333907 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333614 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-etc-kubernetes\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.333907 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333653 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-multus-conf-dir\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.333907 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333718 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-etc-systemd\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.333907 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333731 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-host-run-multus-certs\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.333907 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333787 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-host-var-lib-kubelet\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.333907 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333802 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-host-run-multus-certs\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.333907 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333820 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2d9e5152-75a1-43b4-9c2b-f42acb2075df-cni-binary-copy\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.333907 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333901 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-etc-kubernetes\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.333907 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333907 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-multus-socket-dir-parent\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.334597 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333947 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2d9e5152-75a1-43b4-9c2b-f42acb2075df-cnibin\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.334597 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333958 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-multus-conf-dir\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.334597 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333980 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2d9e5152-75a1-43b4-9c2b-f42acb2075df-os-release\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.334597 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.333997 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-run\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.334597 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334005 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-etc-kubernetes\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.334597 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334032 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-sys\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.334597 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334038 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-etc-modprobe-d\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.334597 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334034 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-multus-cni-dir\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.334597 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334069 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/2d9e5152-75a1-43b4-9c2b-f42acb2075df-system-cni-dir\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.334597 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334095 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-lib-modules\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.334597 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334100 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-os-release\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.334597 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334124 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vn8q\" (UniqueName: \"kubernetes.io/projected/2d9e5152-75a1-43b4-9c2b-f42acb2075df-kube-api-access-8vn8q\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.334597 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334153 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-host-run-netns\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.334597 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334179 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-cnibin\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.334597 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334203 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-hostroot\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.334597 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334230 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/526fd613-ae22-4771-a739-ad3226226ba3-multus-daemon-config\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.334597 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334254 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d9e5152-75a1-43b4-9c2b-f42acb2075df-system-cni-dir\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.334597 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334311 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-etc-sysconfig\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.335277 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334430 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-lib-modules\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.335277 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334427 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2d9e5152-75a1-43b4-9c2b-f42acb2075df-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.335277 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334481 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-etc-kubernetes\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.335277 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334492 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-sys\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.335277 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334525 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2d9e5152-75a1-43b4-9c2b-f42acb2075df-cnibin\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.335277 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334588 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/2d9e5152-75a1-43b4-9c2b-f42acb2075df-os-release\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.335277 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334589 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-host-run-netns\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.335277 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334666 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-cnibin\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.335277 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334673 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-hostroot\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.335277 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334258 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-etc-sysconfig\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.335277 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334717 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-etc-sysctl-conf\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.335277 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334744 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-var-lib-kubelet\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.335277 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334772 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2d9e5152-75a1-43b4-9c2b-f42acb2075df-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.335277 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334796 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-host-var-lib-cni-bin\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.335277 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334821 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-etc-sysctl-d\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.335277 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334851 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2d9e5152-75a1-43b4-9c2b-f42acb2075df-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.335277 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334877 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-system-cni-dir\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.335277 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.334949 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-system-cni-dir\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.335995 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.335070 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-etc-sysctl-conf\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.335995 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.335091 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/526fd613-ae22-4771-a739-ad3226226ba3-host-var-lib-cni-bin\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.335995 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.335121 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-var-lib-kubelet\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.335995 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.335197 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-etc-sysctl-d\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.335995 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.335520 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2d9e5152-75a1-43b4-9c2b-f42acb2075df-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.335995 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.335708 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2d9e5152-75a1-43b4-9c2b-f42acb2075df-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.335995 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.335806 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/526fd613-ae22-4771-a739-ad3226226ba3-multus-daemon-config\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.335995 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.335930 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/526fd613-ae22-4771-a739-ad3226226ba3-cni-binary-copy\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.336352 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.336336 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-tmp\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.336498 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.336477 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-etc-tuned\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.342453 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.342060 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vn8q\" (UniqueName: \"kubernetes.io/projected/2d9e5152-75a1-43b4-9c2b-f42acb2075df-kube-api-access-8vn8q\") pod \"multus-additional-cni-plugins-dgmt6\" (UID: \"2d9e5152-75a1-43b4-9c2b-f42acb2075df\") " pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.342453 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.342382 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gltrx\" (UniqueName: \"kubernetes.io/projected/53e0cdeb-435b-4719-b0de-3d6d77d6a1db-kube-api-access-gltrx\") pod \"tuned-x7cz7\" (UID: \"53e0cdeb-435b-4719-b0de-3d6d77d6a1db\") " pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.342453 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.342414 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbsz7\" 
(UniqueName: \"kubernetes.io/projected/526fd613-ae22-4771-a739-ad3226226ba3-kube-api-access-dbsz7\") pod \"multus-nc622\" (UID: \"526fd613-ae22-4771-a739-ad3226226ba3\") " pod="openshift-multus/multus-nc622" Apr 16 18:02:12.421534 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.421501 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" Apr 16 18:02:12.429169 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.429147 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-s6vmk" Apr 16 18:02:12.436862 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.436842 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:12.442442 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.442423 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9gw8r" Apr 16 18:02:12.447923 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.447907 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lgc4z" Apr 16 18:02:12.454407 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.454390 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-85mvt" Apr 16 18:02:12.462954 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.462937 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dgmt6" Apr 16 18:02:12.467506 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.467450 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" Apr 16 18:02:12.471948 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.471933 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nc622" Apr 16 18:02:12.738627 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.738531 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs\") pod \"network-metrics-daemon-vnntp\" (UID: \"e1d07798-4149-483d-a793-b43a2b14fdbf\") " pod="openshift-multus/network-metrics-daemon-vnntp" Apr 16 18:02:12.738777 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:12.738673 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:12.738777 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:12.738746 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs podName:e1d07798-4149-483d-a793-b43a2b14fdbf nodeName:}" failed. No retries permitted until 2026-04-16 18:02:13.738727859 +0000 UTC m=+4.042461252 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs") pod "network-metrics-daemon-vnntp" (UID: "e1d07798-4149-483d-a793-b43a2b14fdbf") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:12.829791 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:12.829765 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2c3dbfa_ec70_406e_af1c_6ac3c1aba031.slice/crio-8f9c66a384185c5331266397edaa0f942a8a0938f76a05fe3948265273d2750d WatchSource:0}: Error finding container 8f9c66a384185c5331266397edaa0f942a8a0938f76a05fe3948265273d2750d: Status 404 returned error can't find the container with id 8f9c66a384185c5331266397edaa0f942a8a0938f76a05fe3948265273d2750d Apr 16 18:02:12.831448 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:12.831041 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod526fd613_ae22_4771_a739_ad3226226ba3.slice/crio-69d02dea871296420779da3c6f338fdda721df7e4d673d22489b8f5fe6735c3e WatchSource:0}: Error finding container 69d02dea871296420779da3c6f338fdda721df7e4d673d22489b8f5fe6735c3e: Status 404 returned error can't find the container with id 69d02dea871296420779da3c6f338fdda721df7e4d673d22489b8f5fe6735c3e Apr 16 18:02:12.837175 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:12.837147 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b31ea4d_734b_4f90_a86a_3706e8c5d58f.slice/crio-65b3b6a7d316888393a04f80234ff7fd6b232a72ac1298a1edbd60051266e678 WatchSource:0}: Error finding container 65b3b6a7d316888393a04f80234ff7fd6b232a72ac1298a1edbd60051266e678: Status 404 returned error can't find the container with id 65b3b6a7d316888393a04f80234ff7fd6b232a72ac1298a1edbd60051266e678 Apr 16 18:02:12.838386 
ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:12.838367 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc11179d3_6347_4962_9a7f_b134abd01cf4.slice/crio-06322097be4ada68377b3deb94e57d56e19b3d66543db14468879aaced189e29 WatchSource:0}: Error finding container 06322097be4ada68377b3deb94e57d56e19b3d66543db14468879aaced189e29: Status 404 returned error can't find the container with id 06322097be4ada68377b3deb94e57d56e19b3d66543db14468879aaced189e29 Apr 16 18:02:12.839117 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:12.839088 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8l98t\" (UniqueName: \"kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t\") pod \"network-check-target-r8w9f\" (UID: \"378c671e-1d04-4256-a001-a7ce2a0d3b86\") " pod="openshift-network-diagnostics/network-check-target-r8w9f" Apr 16 18:02:12.839246 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:12.839230 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:12.839320 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:12.839248 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:12.839320 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:12.839257 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8l98t for pod openshift-network-diagnostics/network-check-target-r8w9f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:12.839320 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:12.839292 2578 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t podName:378c671e-1d04-4256-a001-a7ce2a0d3b86 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:13.839278777 +0000 UTC m=+4.143012151 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-8l98t" (UniqueName: "kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t") pod "network-check-target-r8w9f" (UID: "378c671e-1d04-4256-a001-a7ce2a0d3b86") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:12.840961 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:12.840919 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d9e5152_75a1_43b4_9c2b_f42acb2075df.slice/crio-013ec67c3c3356bb48a7f7697edefe42097498aae29b5043e87526d50905040f WatchSource:0}: Error finding container 013ec67c3c3356bb48a7f7697edefe42097498aae29b5043e87526d50905040f: Status 404 returned error can't find the container with id 013ec67c3c3356bb48a7f7697edefe42097498aae29b5043e87526d50905040f Apr 16 18:02:12.841837 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:12.841752 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35df5907_2c78_469e_9cb6_41fb0dfca177.slice/crio-062abe2db4d68cb3d06515bb20378a08ff6681318e2e1781ffc202a10ac32630 WatchSource:0}: Error finding container 062abe2db4d68cb3d06515bb20378a08ff6681318e2e1781ffc202a10ac32630: Status 404 returned error can't find the container with id 062abe2db4d68cb3d06515bb20378a08ff6681318e2e1781ffc202a10ac32630 Apr 16 18:02:12.842890 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:12.842802 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod179b0e84_c4b0_49f4_af8c_88127e5c999a.slice/crio-ad242a30ff87b3ce0a7a81ba0bc13c3a7ca5244f3abdb1ae75799c779d81af0f WatchSource:0}: Error finding container ad242a30ff87b3ce0a7a81ba0bc13c3a7ca5244f3abdb1ae75799c779d81af0f: Status 404 returned error can't find the container with id ad242a30ff87b3ce0a7a81ba0bc13c3a7ca5244f3abdb1ae75799c779d81af0f Apr 16 18:02:12.843974 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:02:12.843953 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7430323d_b0e8_4d21_9d45_f29e8e9f976f.slice/crio-463c45da02d63ccfb6fd4d38ab5ef717dffb32b2ba1de5b34de18e77f7a9b434 WatchSource:0}: Error finding container 463c45da02d63ccfb6fd4d38ab5ef717dffb32b2ba1de5b34de18e77f7a9b434: Status 404 returned error can't find the container with id 463c45da02d63ccfb6fd4d38ab5ef717dffb32b2ba1de5b34de18e77f7a9b434 Apr 16 18:02:13.162589 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:13.162311 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:57:11 +0000 UTC" deadline="2027-11-03 12:47:49.138580071 +0000 UTC" Apr 16 18:02:13.162589 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:13.162510 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13578h45m35.976072886s" Apr 16 18:02:13.265339 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:13.265303 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s6vmk" event={"ID":"7430323d-b0e8-4d21-9d45-f29e8e9f976f","Type":"ContainerStarted","Data":"463c45da02d63ccfb6fd4d38ab5ef717dffb32b2ba1de5b34de18e77f7a9b434"} Apr 16 18:02:13.269980 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:13.269952 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/node-resolver-lgc4z" event={"ID":"179b0e84-c4b0-49f4-af8c-88127e5c999a","Type":"ContainerStarted","Data":"ad242a30ff87b3ce0a7a81ba0bc13c3a7ca5244f3abdb1ae75799c779d81af0f"} Apr 16 18:02:13.273202 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:13.273176 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" event={"ID":"c11179d3-6347-4962-9a7f-b134abd01cf4","Type":"ContainerStarted","Data":"06322097be4ada68377b3deb94e57d56e19b3d66543db14468879aaced189e29"} Apr 16 18:02:13.278114 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:13.277341 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-55.ec2.internal" event={"ID":"eceb33926282e4ef98e3ef9d13dc120a","Type":"ContainerStarted","Data":"152b80510aceeb9ecba3b40d1f0e007d735ce624b9eb698de0bfc1dfb782e729"} Apr 16 18:02:13.281081 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:13.281015 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" event={"ID":"35df5907-2c78-469e-9cb6-41fb0dfca177","Type":"ContainerStarted","Data":"062abe2db4d68cb3d06515bb20378a08ff6681318e2e1781ffc202a10ac32630"} Apr 16 18:02:13.287846 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:13.287788 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9gw8r" event={"ID":"3b31ea4d-734b-4f90-a86a-3706e8c5d58f","Type":"ContainerStarted","Data":"65b3b6a7d316888393a04f80234ff7fd6b232a72ac1298a1edbd60051266e678"} Apr 16 18:02:13.295237 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:13.294070 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-55.ec2.internal" podStartSLOduration=2.294056233 podStartE2EDuration="2.294056233s" podCreationTimestamp="2026-04-16 18:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-04-16 18:02:13.293840309 +0000 UTC m=+3.597573704" watchObservedRunningTime="2026-04-16 18:02:13.294056233 +0000 UTC m=+3.597789628"
Apr 16 18:02:13.297534 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:13.297493 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dgmt6" event={"ID":"2d9e5152-75a1-43b4-9c2b-f42acb2075df","Type":"ContainerStarted","Data":"013ec67c3c3356bb48a7f7697edefe42097498aae29b5043e87526d50905040f"}
Apr 16 18:02:13.300289 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:13.300261 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" event={"ID":"53e0cdeb-435b-4719-b0de-3d6d77d6a1db","Type":"ContainerStarted","Data":"d476c3abda53898aa66e51a955bb45cd0a656d7d1e8b8d19873f33ab20004517"}
Apr 16 18:02:13.301546 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:13.301516 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nc622" event={"ID":"526fd613-ae22-4771-a739-ad3226226ba3","Type":"ContainerStarted","Data":"69d02dea871296420779da3c6f338fdda721df7e4d673d22489b8f5fe6735c3e"}
Apr 16 18:02:13.303371 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:13.303346 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-85mvt" event={"ID":"e2c3dbfa-ec70-406e-af1c-6ac3c1aba031","Type":"ContainerStarted","Data":"8f9c66a384185c5331266397edaa0f942a8a0938f76a05fe3948265273d2750d"}
Apr 16 18:02:13.750415 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:13.750358 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs\") pod \"network-metrics-daemon-vnntp\" (UID: \"e1d07798-4149-483d-a793-b43a2b14fdbf\") " pod="openshift-multus/network-metrics-daemon-vnntp"
Apr 16 18:02:13.750658 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:13.750580 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:13.750742 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:13.750673 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs podName:e1d07798-4149-483d-a793-b43a2b14fdbf nodeName:}" failed. No retries permitted until 2026-04-16 18:02:15.750652254 +0000 UTC m=+6.054385632 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs") pod "network-metrics-daemon-vnntp" (UID: "e1d07798-4149-483d-a793-b43a2b14fdbf") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:13.852004 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:13.851344 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8l98t\" (UniqueName: \"kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t\") pod \"network-check-target-r8w9f\" (UID: \"378c671e-1d04-4256-a001-a7ce2a0d3b86\") " pod="openshift-network-diagnostics/network-check-target-r8w9f"
Apr 16 18:02:13.852004 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:13.851519 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:02:13.852004 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:13.851539 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:02:13.852004 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:13.851553 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8l98t for pod openshift-network-diagnostics/network-check-target-r8w9f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:13.852004 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:13.851629 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t podName:378c671e-1d04-4256-a001-a7ce2a0d3b86 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:15.851591054 +0000 UTC m=+6.155324442 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-8l98t" (UniqueName: "kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t") pod "network-check-target-r8w9f" (UID: "378c671e-1d04-4256-a001-a7ce2a0d3b86") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:14.251833 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:14.251422 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vnntp"
Apr 16 18:02:14.251833 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:14.251557 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vnntp" podUID="e1d07798-4149-483d-a793-b43a2b14fdbf"
Apr 16 18:02:14.251833 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:14.251659 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8w9f"
Apr 16 18:02:14.251833 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:14.251743 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8w9f" podUID="378c671e-1d04-4256-a001-a7ce2a0d3b86"
Apr 16 18:02:14.316153 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:14.316114 2578 generic.go:358] "Generic (PLEG): container finished" podID="5883b1b032604264e78d8f20b1008666" containerID="aae954d4419b0dab4397c3e6beed20615508d20fb41790067998a207675bfbac" exitCode=0
Apr 16 18:02:14.317092 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:14.317063 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-55.ec2.internal" event={"ID":"5883b1b032604264e78d8f20b1008666","Type":"ContainerDied","Data":"aae954d4419b0dab4397c3e6beed20615508d20fb41790067998a207675bfbac"}
Apr 16 18:02:15.339719 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:15.339675 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-55.ec2.internal" event={"ID":"5883b1b032604264e78d8f20b1008666","Type":"ContainerStarted","Data":"ac05b0cb8a15bc551d0fd308e267a1712e4b651f1f826fb598101322ad4aa2c0"}
Apr 16 18:02:15.354621 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:15.354551 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-55.ec2.internal" podStartSLOduration=4.354534422 podStartE2EDuration="4.354534422s" podCreationTimestamp="2026-04-16 18:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:02:15.354351058 +0000 UTC m=+5.658084454" watchObservedRunningTime="2026-04-16 18:02:15.354534422 +0000 UTC m=+5.658267818"
Apr 16 18:02:15.767776 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:15.767686 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs\") pod \"network-metrics-daemon-vnntp\" (UID: \"e1d07798-4149-483d-a793-b43a2b14fdbf\") " pod="openshift-multus/network-metrics-daemon-vnntp"
Apr 16 18:02:15.767944 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:15.767869 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:15.767944 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:15.767933 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs podName:e1d07798-4149-483d-a793-b43a2b14fdbf nodeName:}" failed. No retries permitted until 2026-04-16 18:02:19.767915391 +0000 UTC m=+10.071648770 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs") pod "network-metrics-daemon-vnntp" (UID: "e1d07798-4149-483d-a793-b43a2b14fdbf") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:15.868957 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:15.868914 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8l98t\" (UniqueName: \"kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t\") pod \"network-check-target-r8w9f\" (UID: \"378c671e-1d04-4256-a001-a7ce2a0d3b86\") " pod="openshift-network-diagnostics/network-check-target-r8w9f"
Apr 16 18:02:15.869143 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:15.869083 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:02:15.869143 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:15.869102 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:02:15.869143 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:15.869116 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8l98t for pod openshift-network-diagnostics/network-check-target-r8w9f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:15.869298 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:15.869174 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t podName:378c671e-1d04-4256-a001-a7ce2a0d3b86 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:19.869155069 +0000 UTC m=+10.172888448 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-8l98t" (UniqueName: "kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t") pod "network-check-target-r8w9f" (UID: "378c671e-1d04-4256-a001-a7ce2a0d3b86") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:16.253542 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:16.253469 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8w9f"
Apr 16 18:02:16.253710 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:16.253596 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8w9f" podUID="378c671e-1d04-4256-a001-a7ce2a0d3b86"
Apr 16 18:02:16.253710 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:16.253640 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vnntp"
Apr 16 18:02:16.253828 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:16.253762 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vnntp" podUID="e1d07798-4149-483d-a793-b43a2b14fdbf"
Apr 16 18:02:18.250962 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:18.250927 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8w9f"
Apr 16 18:02:18.251393 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:18.251056 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8w9f" podUID="378c671e-1d04-4256-a001-a7ce2a0d3b86"
Apr 16 18:02:18.253344 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:18.253319 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vnntp"
Apr 16 18:02:18.253468 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:18.253444 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vnntp" podUID="e1d07798-4149-483d-a793-b43a2b14fdbf"
Apr 16 18:02:19.805289 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:19.805246 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs\") pod \"network-metrics-daemon-vnntp\" (UID: \"e1d07798-4149-483d-a793-b43a2b14fdbf\") " pod="openshift-multus/network-metrics-daemon-vnntp"
Apr 16 18:02:19.805724 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:19.805419 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:19.805724 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:19.805497 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs podName:e1d07798-4149-483d-a793-b43a2b14fdbf nodeName:}" failed. No retries permitted until 2026-04-16 18:02:27.805476116 +0000 UTC m=+18.109209494 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs") pod "network-metrics-daemon-vnntp" (UID: "e1d07798-4149-483d-a793-b43a2b14fdbf") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:19.906087 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:19.906046 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8l98t\" (UniqueName: \"kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t\") pod \"network-check-target-r8w9f\" (UID: \"378c671e-1d04-4256-a001-a7ce2a0d3b86\") " pod="openshift-network-diagnostics/network-check-target-r8w9f"
Apr 16 18:02:19.906271 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:19.906225 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:02:19.906271 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:19.906249 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:02:19.906271 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:19.906263 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8l98t for pod openshift-network-diagnostics/network-check-target-r8w9f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:19.906385 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:19.906312 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t podName:378c671e-1d04-4256-a001-a7ce2a0d3b86 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:27.906297046 +0000 UTC m=+18.210030420 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-8l98t" (UniqueName: "kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t") pod "network-check-target-r8w9f" (UID: "378c671e-1d04-4256-a001-a7ce2a0d3b86") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:20.251875 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:20.251794 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8w9f"
Apr 16 18:02:20.251994 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:20.251898 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8w9f" podUID="378c671e-1d04-4256-a001-a7ce2a0d3b86"
Apr 16 18:02:20.252265 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:20.252246 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vnntp"
Apr 16 18:02:20.252351 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:20.252333 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vnntp" podUID="e1d07798-4149-483d-a793-b43a2b14fdbf"
Apr 16 18:02:22.250785 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:22.250752 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8w9f"
Apr 16 18:02:22.251172 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:22.250867 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8w9f" podUID="378c671e-1d04-4256-a001-a7ce2a0d3b86"
Apr 16 18:02:22.251172 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:22.250957 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vnntp"
Apr 16 18:02:22.251172 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:22.251084 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vnntp" podUID="e1d07798-4149-483d-a793-b43a2b14fdbf"
Apr 16 18:02:22.913524 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:22.913478 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-pwdrb"]
Apr 16 18:02:22.915754 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:22.915733 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwdrb"
Apr 16 18:02:22.915880 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:22.915812 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pwdrb" podUID="9d78cda3-41b6-4a64-9426-20bac8b58ad1"
Apr 16 18:02:23.032918 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:23.032885 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9d78cda3-41b6-4a64-9426-20bac8b58ad1-kubelet-config\") pod \"global-pull-secret-syncer-pwdrb\" (UID: \"9d78cda3-41b6-4a64-9426-20bac8b58ad1\") " pod="kube-system/global-pull-secret-syncer-pwdrb"
Apr 16 18:02:23.033084 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:23.032981 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9d78cda3-41b6-4a64-9426-20bac8b58ad1-dbus\") pod \"global-pull-secret-syncer-pwdrb\" (UID: \"9d78cda3-41b6-4a64-9426-20bac8b58ad1\") " pod="kube-system/global-pull-secret-syncer-pwdrb"
Apr 16 18:02:23.033084 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:23.033005 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9d78cda3-41b6-4a64-9426-20bac8b58ad1-original-pull-secret\") pod \"global-pull-secret-syncer-pwdrb\" (UID: \"9d78cda3-41b6-4a64-9426-20bac8b58ad1\") " pod="kube-system/global-pull-secret-syncer-pwdrb"
Apr 16 18:02:23.133550 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:23.133513 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9d78cda3-41b6-4a64-9426-20bac8b58ad1-dbus\") pod \"global-pull-secret-syncer-pwdrb\" (UID: \"9d78cda3-41b6-4a64-9426-20bac8b58ad1\") " pod="kube-system/global-pull-secret-syncer-pwdrb"
Apr 16 18:02:23.133744 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:23.133558 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9d78cda3-41b6-4a64-9426-20bac8b58ad1-original-pull-secret\") pod \"global-pull-secret-syncer-pwdrb\" (UID: \"9d78cda3-41b6-4a64-9426-20bac8b58ad1\") " pod="kube-system/global-pull-secret-syncer-pwdrb"
Apr 16 18:02:23.133744 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:23.133587 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9d78cda3-41b6-4a64-9426-20bac8b58ad1-kubelet-config\") pod \"global-pull-secret-syncer-pwdrb\" (UID: \"9d78cda3-41b6-4a64-9426-20bac8b58ad1\") " pod="kube-system/global-pull-secret-syncer-pwdrb"
Apr 16 18:02:23.133744 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:23.133688 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9d78cda3-41b6-4a64-9426-20bac8b58ad1-kubelet-config\") pod \"global-pull-secret-syncer-pwdrb\" (UID: \"9d78cda3-41b6-4a64-9426-20bac8b58ad1\") " pod="kube-system/global-pull-secret-syncer-pwdrb"
Apr 16 18:02:23.133744 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:23.133695 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9d78cda3-41b6-4a64-9426-20bac8b58ad1-dbus\") pod \"global-pull-secret-syncer-pwdrb\" (UID: \"9d78cda3-41b6-4a64-9426-20bac8b58ad1\") " pod="kube-system/global-pull-secret-syncer-pwdrb"
Apr 16 18:02:23.133964 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:23.133744 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:02:23.133964 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:23.133812 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d78cda3-41b6-4a64-9426-20bac8b58ad1-original-pull-secret podName:9d78cda3-41b6-4a64-9426-20bac8b58ad1 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:23.633797415 +0000 UTC m=+13.937530788 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9d78cda3-41b6-4a64-9426-20bac8b58ad1-original-pull-secret") pod "global-pull-secret-syncer-pwdrb" (UID: "9d78cda3-41b6-4a64-9426-20bac8b58ad1") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:02:23.637038 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:23.637002 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9d78cda3-41b6-4a64-9426-20bac8b58ad1-original-pull-secret\") pod \"global-pull-secret-syncer-pwdrb\" (UID: \"9d78cda3-41b6-4a64-9426-20bac8b58ad1\") " pod="kube-system/global-pull-secret-syncer-pwdrb"
Apr 16 18:02:23.637480 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:23.637125 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:02:23.637480 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:23.637186 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d78cda3-41b6-4a64-9426-20bac8b58ad1-original-pull-secret podName:9d78cda3-41b6-4a64-9426-20bac8b58ad1 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:24.637167137 +0000 UTC m=+14.940900516 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9d78cda3-41b6-4a64-9426-20bac8b58ad1-original-pull-secret") pod "global-pull-secret-syncer-pwdrb" (UID: "9d78cda3-41b6-4a64-9426-20bac8b58ad1") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:02:24.250298 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:24.250264 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8w9f"
Apr 16 18:02:24.250462 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:24.250263 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwdrb"
Apr 16 18:02:24.250462 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:24.250386 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8w9f" podUID="378c671e-1d04-4256-a001-a7ce2a0d3b86"
Apr 16 18:02:24.250585 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:24.250494 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pwdrb" podUID="9d78cda3-41b6-4a64-9426-20bac8b58ad1"
Apr 16 18:02:24.250585 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:24.250264 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vnntp"
Apr 16 18:02:24.250703 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:24.250654 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vnntp" podUID="e1d07798-4149-483d-a793-b43a2b14fdbf"
Apr 16 18:02:24.645870 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:24.645778 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9d78cda3-41b6-4a64-9426-20bac8b58ad1-original-pull-secret\") pod \"global-pull-secret-syncer-pwdrb\" (UID: \"9d78cda3-41b6-4a64-9426-20bac8b58ad1\") " pod="kube-system/global-pull-secret-syncer-pwdrb"
Apr 16 18:02:24.646265 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:24.645959 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:02:24.646265 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:24.646036 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d78cda3-41b6-4a64-9426-20bac8b58ad1-original-pull-secret podName:9d78cda3-41b6-4a64-9426-20bac8b58ad1 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:26.646014858 +0000 UTC m=+16.949748239 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9d78cda3-41b6-4a64-9426-20bac8b58ad1-original-pull-secret") pod "global-pull-secret-syncer-pwdrb" (UID: "9d78cda3-41b6-4a64-9426-20bac8b58ad1") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:02:26.250944 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:26.250910 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwdrb"
Apr 16 18:02:26.250944 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:26.250936 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vnntp"
Apr 16 18:02:26.251505 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:26.250924 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8w9f"
Apr 16 18:02:26.251505 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:26.251025 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pwdrb" podUID="9d78cda3-41b6-4a64-9426-20bac8b58ad1"
Apr 16 18:02:26.251505 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:26.251093 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8w9f" podUID="378c671e-1d04-4256-a001-a7ce2a0d3b86"
Apr 16 18:02:26.251505 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:26.251198 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vnntp" podUID="e1d07798-4149-483d-a793-b43a2b14fdbf"
Apr 16 18:02:26.660619 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:26.660523 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9d78cda3-41b6-4a64-9426-20bac8b58ad1-original-pull-secret\") pod \"global-pull-secret-syncer-pwdrb\" (UID: \"9d78cda3-41b6-4a64-9426-20bac8b58ad1\") " pod="kube-system/global-pull-secret-syncer-pwdrb"
Apr 16 18:02:26.660794 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:26.660713 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:02:26.660794 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:26.660791 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d78cda3-41b6-4a64-9426-20bac8b58ad1-original-pull-secret podName:9d78cda3-41b6-4a64-9426-20bac8b58ad1 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:30.660769998 +0000 UTC m=+20.964503370 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9d78cda3-41b6-4a64-9426-20bac8b58ad1-original-pull-secret") pod "global-pull-secret-syncer-pwdrb" (UID: "9d78cda3-41b6-4a64-9426-20bac8b58ad1") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:02:27.870787 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:27.870751 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs\") pod \"network-metrics-daemon-vnntp\" (UID: \"e1d07798-4149-483d-a793-b43a2b14fdbf\") " pod="openshift-multus/network-metrics-daemon-vnntp"
Apr 16 18:02:27.871187 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:27.870923 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:27.871187 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:27.870988 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs podName:e1d07798-4149-483d-a793-b43a2b14fdbf nodeName:}" failed. No retries permitted until 2026-04-16 18:02:43.870971179 +0000 UTC m=+34.174704554 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs") pod "network-metrics-daemon-vnntp" (UID: "e1d07798-4149-483d-a793-b43a2b14fdbf") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:27.971718 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:27.971685 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8l98t\" (UniqueName: \"kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t\") pod \"network-check-target-r8w9f\" (UID: \"378c671e-1d04-4256-a001-a7ce2a0d3b86\") " pod="openshift-network-diagnostics/network-check-target-r8w9f"
Apr 16 18:02:27.971891 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:27.971862 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:02:27.971891 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:27.971886 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:02:27.971984 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:27.971901 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8l98t for pod openshift-network-diagnostics/network-check-target-r8w9f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:27.971984 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:27.971963 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t podName:378c671e-1d04-4256-a001-a7ce2a0d3b86 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:43.97194536 +0000 UTC m=+34.275678736 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-8l98t" (UniqueName: "kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t") pod "network-check-target-r8w9f" (UID: "378c671e-1d04-4256-a001-a7ce2a0d3b86") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:28.250741 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:28.250656 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8w9f"
Apr 16 18:02:28.250899 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:28.250656 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwdrb"
Apr 16 18:02:28.250899 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:28.250780 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8w9f" podUID="378c671e-1d04-4256-a001-a7ce2a0d3b86"
Apr 16 18:02:28.250899 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:28.250865 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pwdrb" podUID="9d78cda3-41b6-4a64-9426-20bac8b58ad1"
Apr 16 18:02:28.250899 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:28.250656 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vnntp"
Apr 16 18:02:28.251138 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:28.251006 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vnntp" podUID="e1d07798-4149-483d-a793-b43a2b14fdbf"
Apr 16 18:02:30.252235 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:30.252091 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vnntp"
Apr 16 18:02:30.252235 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:30.252210 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vnntp" podUID="e1d07798-4149-483d-a793-b43a2b14fdbf"
Apr 16 18:02:30.252235 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:30.252229 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8w9f"
Apr 16 18:02:30.252621 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:30.252279 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwdrb" Apr 16 18:02:30.252621 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:30.252365 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pwdrb" podUID="9d78cda3-41b6-4a64-9426-20bac8b58ad1" Apr 16 18:02:30.252621 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:30.252440 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8w9f" podUID="378c671e-1d04-4256-a001-a7ce2a0d3b86" Apr 16 18:02:30.364526 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:30.364308 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" event={"ID":"35df5907-2c78-469e-9cb6-41fb0dfca177","Type":"ContainerStarted","Data":"85251195af7f7258a064655de2da8327c446159c33d6fd8100dc309561ef0c73"} Apr 16 18:02:30.366799 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:30.366698 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9gw8r" event={"ID":"3b31ea4d-734b-4f90-a86a-3706e8c5d58f","Type":"ContainerStarted","Data":"22b12690fa1b9cd2c0df87f1805f998a1ea408a0e3d0929554665108879635b9"} Apr 16 18:02:30.373060 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:30.372717 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" 
event={"ID":"53e0cdeb-435b-4719-b0de-3d6d77d6a1db","Type":"ContainerStarted","Data":"c4ec0768d2db4e579008f5f7d112e1fd8c2aea04dc999280d133b1dd7084de4d"} Apr 16 18:02:30.383862 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:30.383364 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-9gw8r" podStartSLOduration=3.202022232 podStartE2EDuration="20.383346927s" podCreationTimestamp="2026-04-16 18:02:10 +0000 UTC" firstStartedPulling="2026-04-16 18:02:12.838573456 +0000 UTC m=+3.142306832" lastFinishedPulling="2026-04-16 18:02:30.019898144 +0000 UTC m=+20.323631527" observedRunningTime="2026-04-16 18:02:30.382948266 +0000 UTC m=+20.686681661" watchObservedRunningTime="2026-04-16 18:02:30.383346927 +0000 UTC m=+20.687080303" Apr 16 18:02:30.402372 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:30.402330 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-x7cz7" podStartSLOduration=3.253271073 podStartE2EDuration="20.402315364s" podCreationTimestamp="2026-04-16 18:02:10 +0000 UTC" firstStartedPulling="2026-04-16 18:02:12.836727287 +0000 UTC m=+3.140460669" lastFinishedPulling="2026-04-16 18:02:29.985771582 +0000 UTC m=+20.289504960" observedRunningTime="2026-04-16 18:02:30.401420733 +0000 UTC m=+20.705154128" watchObservedRunningTime="2026-04-16 18:02:30.402315364 +0000 UTC m=+20.706048758" Apr 16 18:02:30.692937 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:30.692903 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9d78cda3-41b6-4a64-9426-20bac8b58ad1-original-pull-secret\") pod \"global-pull-secret-syncer-pwdrb\" (UID: \"9d78cda3-41b6-4a64-9426-20bac8b58ad1\") " pod="kube-system/global-pull-secret-syncer-pwdrb" Apr 16 18:02:30.693074 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:30.693040 2578 secret.go:189] Couldn't get secret 
kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:30.693120 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:30.693099 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d78cda3-41b6-4a64-9426-20bac8b58ad1-original-pull-secret podName:9d78cda3-41b6-4a64-9426-20bac8b58ad1 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:38.693081606 +0000 UTC m=+28.996814979 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9d78cda3-41b6-4a64-9426-20bac8b58ad1-original-pull-secret") pod "global-pull-secret-syncer-pwdrb" (UID: "9d78cda3-41b6-4a64-9426-20bac8b58ad1") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:31.375789 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:31.375545 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nc622" event={"ID":"526fd613-ae22-4771-a739-ad3226226ba3","Type":"ContainerStarted","Data":"db21b8c14fc7a5d2cc7317e465d287ea5ade6b4ad898ab09a05161e6d385ba42"} Apr 16 18:02:31.376923 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:31.376900 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-85mvt" event={"ID":"e2c3dbfa-ec70-406e-af1c-6ac3c1aba031","Type":"ContainerStarted","Data":"1e6ee7ca4a765bfc59529253083ea9212870ebfa8d4c41fae10da9b7d39efbe8"} Apr 16 18:02:31.378066 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:31.378033 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lgc4z" event={"ID":"179b0e84-c4b0-49f4-af8c-88127e5c999a","Type":"ContainerStarted","Data":"491db4047f9bb25af881005e075c97d37ab591b13cdc3a7ac94289d3e3921bd9"} Apr 16 18:02:31.379990 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:31.379967 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" 
event={"ID":"c11179d3-6347-4962-9a7f-b134abd01cf4","Type":"ContainerStarted","Data":"1e38b1330b3beda8f831c1b2bf5378f04c8e038c314460d3529c2def3f8d90aa"} Apr 16 18:02:31.379990 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:31.379994 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" event={"ID":"c11179d3-6347-4962-9a7f-b134abd01cf4","Type":"ContainerStarted","Data":"61a38e4de8e7bbd020b82e706aed3735291d6369db547386c55db4a91315a812"} Apr 16 18:02:31.380350 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:31.380003 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" event={"ID":"c11179d3-6347-4962-9a7f-b134abd01cf4","Type":"ContainerStarted","Data":"bd22ade0bf556dc471a1fe5f0b9f7ae294f1ee0fd02a3fc92dadf16aa17da62d"} Apr 16 18:02:31.380350 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:31.380011 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" event={"ID":"c11179d3-6347-4962-9a7f-b134abd01cf4","Type":"ContainerStarted","Data":"c15d12bea5abafd98aae3cc73dc4b506b66bd0339eb29a1fa6ca5cc4a03e2e96"} Apr 16 18:02:31.381264 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:31.381242 2578 generic.go:358] "Generic (PLEG): container finished" podID="2d9e5152-75a1-43b4-9c2b-f42acb2075df" containerID="35274b87ff498a65807b3911e048b3ba03ac296ec4e1b2998fc3f066e39b6fe3" exitCode=0 Apr 16 18:02:31.381331 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:31.381309 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dgmt6" event={"ID":"2d9e5152-75a1-43b4-9c2b-f42acb2075df","Type":"ContainerDied","Data":"35274b87ff498a65807b3911e048b3ba03ac296ec4e1b2998fc3f066e39b6fe3"} Apr 16 18:02:31.396582 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:31.396547 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nc622" 
podStartSLOduration=4.173359492 podStartE2EDuration="21.396536047s" podCreationTimestamp="2026-04-16 18:02:10 +0000 UTC" firstStartedPulling="2026-04-16 18:02:12.833254972 +0000 UTC m=+3.136988358" lastFinishedPulling="2026-04-16 18:02:30.05643154 +0000 UTC m=+20.360164913" observedRunningTime="2026-04-16 18:02:31.396506902 +0000 UTC m=+21.700240296" watchObservedRunningTime="2026-04-16 18:02:31.396536047 +0000 UTC m=+21.700269441" Apr 16 18:02:31.413402 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:31.413360 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-85mvt" podStartSLOduration=4.197868428 podStartE2EDuration="21.413350252s" podCreationTimestamp="2026-04-16 18:02:10 +0000 UTC" firstStartedPulling="2026-04-16 18:02:12.831647546 +0000 UTC m=+3.135380926" lastFinishedPulling="2026-04-16 18:02:30.047129376 +0000 UTC m=+20.350862750" observedRunningTime="2026-04-16 18:02:31.412661542 +0000 UTC m=+21.716394937" watchObservedRunningTime="2026-04-16 18:02:31.413350252 +0000 UTC m=+21.717083647" Apr 16 18:02:31.455044 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:31.454996 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lgc4z" podStartSLOduration=4.314556432 podStartE2EDuration="21.454980274s" podCreationTimestamp="2026-04-16 18:02:10 +0000 UTC" firstStartedPulling="2026-04-16 18:02:12.845349091 +0000 UTC m=+3.149082471" lastFinishedPulling="2026-04-16 18:02:29.985772925 +0000 UTC m=+20.289506313" observedRunningTime="2026-04-16 18:02:31.454581774 +0000 UTC m=+21.758315168" watchObservedRunningTime="2026-04-16 18:02:31.454980274 +0000 UTC m=+21.758713668" Apr 16 18:02:31.685519 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:31.685484 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 18:02:32.110161 ip-10-0-134-55 
kubenswrapper[2578]: I0416 18:02:32.110128 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-9gw8r" Apr 16 18:02:32.190775 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:32.190549 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:02:31.685504691Z","UUID":"e15399de-ecc6-4173-b165-4c838b0d3bbe","Handler":null,"Name":"","Endpoint":""} Apr 16 18:02:32.192363 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:32.192340 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 18:02:32.192363 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:32.192368 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 18:02:32.250853 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:32.250829 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8w9f" Apr 16 18:02:32.250853 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:32.250851 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vnntp" Apr 16 18:02:32.251078 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:32.250859 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwdrb" Apr 16 18:02:32.251078 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:32.250946 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8w9f" podUID="378c671e-1d04-4256-a001-a7ce2a0d3b86" Apr 16 18:02:32.251078 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:32.251030 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vnntp" podUID="e1d07798-4149-483d-a793-b43a2b14fdbf" Apr 16 18:02:32.251239 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:32.251115 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-pwdrb" podUID="9d78cda3-41b6-4a64-9426-20bac8b58ad1" Apr 16 18:02:32.385935 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:32.385832 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s6vmk" event={"ID":"7430323d-b0e8-4d21-9d45-f29e8e9f976f","Type":"ContainerStarted","Data":"e6719d3ea1a8f63948ff6b7bcf7df55f82fed2dadd9e66702442abfcb553433c"} Apr 16 18:02:32.389983 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:32.389952 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" event={"ID":"c11179d3-6347-4962-9a7f-b134abd01cf4","Type":"ContainerStarted","Data":"82f2ab12da303525542092bd1178c727a7cc73d43bc3e41bc899924bfa981aab"} Apr 16 18:02:32.390121 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:32.389990 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" event={"ID":"c11179d3-6347-4962-9a7f-b134abd01cf4","Type":"ContainerStarted","Data":"e7d499fab0bb198c144f935906c00c876db95995bc230fff81623547a28da9e6"} Apr 16 18:02:32.393527 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:32.393481 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" event={"ID":"35df5907-2c78-469e-9cb6-41fb0dfca177","Type":"ContainerStarted","Data":"e4a837e554977d7f924bee4335a2a0d1787872d1ef4400849fb999edc1361832"} Apr 16 18:02:32.408247 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:32.408189 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-s6vmk" podStartSLOduration=5.202661734 podStartE2EDuration="22.408174434s" podCreationTimestamp="2026-04-16 18:02:10 +0000 UTC" firstStartedPulling="2026-04-16 18:02:12.845810934 +0000 UTC m=+3.149544308" lastFinishedPulling="2026-04-16 18:02:30.051323634 +0000 UTC m=+20.355057008" observedRunningTime="2026-04-16 
18:02:32.407563948 +0000 UTC m=+22.711297353" watchObservedRunningTime="2026-04-16 18:02:32.408174434 +0000 UTC m=+22.711907828" Apr 16 18:02:34.250582 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:34.250407 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vnntp" Apr 16 18:02:34.250990 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:34.250407 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwdrb" Apr 16 18:02:34.250990 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:34.250407 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8w9f" Apr 16 18:02:34.250990 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:34.250692 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vnntp" podUID="e1d07798-4149-483d-a793-b43a2b14fdbf" Apr 16 18:02:34.250990 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:34.250789 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-pwdrb" podUID="9d78cda3-41b6-4a64-9426-20bac8b58ad1" Apr 16 18:02:34.250990 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:34.250882 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8w9f" podUID="378c671e-1d04-4256-a001-a7ce2a0d3b86" Apr 16 18:02:34.401499 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:34.401460 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" event={"ID":"c11179d3-6347-4962-9a7f-b134abd01cf4","Type":"ContainerStarted","Data":"55ffbc53fde322300146c9bb15018864d6db61ccdc8e8750980b6236ebbfe9cf"} Apr 16 18:02:34.403653 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:34.403624 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" event={"ID":"35df5907-2c78-469e-9cb6-41fb0dfca177","Type":"ContainerStarted","Data":"2def665694d5c733253d79f9fa542141a6658e2044014816a7014b64fb353349"} Apr 16 18:02:34.456832 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:34.456784 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hvvf" podStartSLOduration=4.047395596 podStartE2EDuration="24.456770131s" podCreationTimestamp="2026-04-16 18:02:10 +0000 UTC" firstStartedPulling="2026-04-16 18:02:12.843618155 +0000 UTC m=+3.147351538" lastFinishedPulling="2026-04-16 18:02:33.252992698 +0000 UTC m=+23.556726073" observedRunningTime="2026-04-16 18:02:34.456050946 +0000 UTC m=+24.759784340" watchObservedRunningTime="2026-04-16 18:02:34.456770131 +0000 UTC m=+24.760503542" Apr 16 18:02:34.847719 ip-10-0-134-55 kubenswrapper[2578]: I0416 
18:02:34.847675 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-9gw8r" Apr 16 18:02:34.848409 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:34.848377 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-9gw8r" Apr 16 18:02:35.406163 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:35.406134 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-9gw8r" Apr 16 18:02:36.251134 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:36.251105 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwdrb" Apr 16 18:02:36.251276 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:36.251107 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8w9f" Apr 16 18:02:36.251322 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:36.251286 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8w9f" podUID="378c671e-1d04-4256-a001-a7ce2a0d3b86" Apr 16 18:02:36.251322 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:36.251204 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-pwdrb" podUID="9d78cda3-41b6-4a64-9426-20bac8b58ad1" Apr 16 18:02:36.251322 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:36.251114 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vnntp" Apr 16 18:02:36.251441 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:36.251363 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vnntp" podUID="e1d07798-4149-483d-a793-b43a2b14fdbf" Apr 16 18:02:36.411220 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:36.411177 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" event={"ID":"c11179d3-6347-4962-9a7f-b134abd01cf4","Type":"ContainerStarted","Data":"74d12c07479491aee4f4ebe0fbd861e3b65afedad0f7ba0233f8f26aea8f1b1d"} Apr 16 18:02:36.412127 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:36.411664 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:36.412127 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:36.411688 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:36.413030 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:36.413003 2578 generic.go:358] "Generic (PLEG): container finished" podID="2d9e5152-75a1-43b4-9c2b-f42acb2075df" containerID="9cd156412a3b916c24a29292f7f19329cfddfe5c017145b54e91cc7f4e297d49" exitCode=0 Apr 16 18:02:36.413135 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:36.413082 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-dgmt6" event={"ID":"2d9e5152-75a1-43b4-9c2b-f42acb2075df","Type":"ContainerDied","Data":"9cd156412a3b916c24a29292f7f19329cfddfe5c017145b54e91cc7f4e297d49"} Apr 16 18:02:36.427050 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:36.427029 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:36.483392 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:36.483350 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" podStartSLOduration=8.613142755 podStartE2EDuration="26.483335393s" podCreationTimestamp="2026-04-16 18:02:10 +0000 UTC" firstStartedPulling="2026-04-16 18:02:12.840550537 +0000 UTC m=+3.144283911" lastFinishedPulling="2026-04-16 18:02:30.710743177 +0000 UTC m=+21.014476549" observedRunningTime="2026-04-16 18:02:36.452588484 +0000 UTC m=+26.756321878" watchObservedRunningTime="2026-04-16 18:02:36.483335393 +0000 UTC m=+26.787068796" Apr 16 18:02:37.418348 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:37.418111 2578 generic.go:358] "Generic (PLEG): container finished" podID="2d9e5152-75a1-43b4-9c2b-f42acb2075df" containerID="b1a22c690a8e405a86bf30b7332542eaf1b870f9e81818dcf52c1f5f9db78cea" exitCode=0 Apr 16 18:02:37.418758 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:37.418189 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dgmt6" event={"ID":"2d9e5152-75a1-43b4-9c2b-f42acb2075df","Type":"ContainerDied","Data":"b1a22c690a8e405a86bf30b7332542eaf1b870f9e81818dcf52c1f5f9db78cea"} Apr 16 18:02:37.418973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:37.418904 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:37.433419 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:37.433396 2578 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r77l7" Apr 16 18:02:37.907827 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:37.907795 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vnntp"] Apr 16 18:02:37.908006 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:37.907935 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vnntp" Apr 16 18:02:37.908051 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:37.908031 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vnntp" podUID="e1d07798-4149-483d-a793-b43a2b14fdbf" Apr 16 18:02:37.911160 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:37.911133 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-r8w9f"] Apr 16 18:02:37.911286 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:37.911242 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8w9f" Apr 16 18:02:37.911391 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:37.911365 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r8w9f" podUID="378c671e-1d04-4256-a001-a7ce2a0d3b86" Apr 16 18:02:37.911923 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:37.911899 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pwdrb"] Apr 16 18:02:37.912000 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:37.911989 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwdrb" Apr 16 18:02:37.912120 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:37.912071 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pwdrb" podUID="9d78cda3-41b6-4a64-9426-20bac8b58ad1" Apr 16 18:02:38.422362 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:38.422328 2578 generic.go:358] "Generic (PLEG): container finished" podID="2d9e5152-75a1-43b4-9c2b-f42acb2075df" containerID="6a16f5143783a52eee5976e339c0a5a319d6f000c6c1c613826b6d75093f0ef8" exitCode=0 Apr 16 18:02:38.422813 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:38.422393 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dgmt6" event={"ID":"2d9e5152-75a1-43b4-9c2b-f42acb2075df","Type":"ContainerDied","Data":"6a16f5143783a52eee5976e339c0a5a319d6f000c6c1c613826b6d75093f0ef8"} Apr 16 18:02:38.750132 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:38.750100 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9d78cda3-41b6-4a64-9426-20bac8b58ad1-original-pull-secret\") pod \"global-pull-secret-syncer-pwdrb\" (UID: \"9d78cda3-41b6-4a64-9426-20bac8b58ad1\") " 
pod="kube-system/global-pull-secret-syncer-pwdrb" Apr 16 18:02:38.750282 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:38.750207 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:38.750282 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:38.750257 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d78cda3-41b6-4a64-9426-20bac8b58ad1-original-pull-secret podName:9d78cda3-41b6-4a64-9426-20bac8b58ad1 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:54.75024494 +0000 UTC m=+45.053978317 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9d78cda3-41b6-4a64-9426-20bac8b58ad1-original-pull-secret") pod "global-pull-secret-syncer-pwdrb" (UID: "9d78cda3-41b6-4a64-9426-20bac8b58ad1") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:39.250744 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:39.250711 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vnntp" Apr 16 18:02:39.250923 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:39.250861 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vnntp" podUID="e1d07798-4149-483d-a793-b43a2b14fdbf" Apr 16 18:02:40.254938 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:40.254498 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8w9f" Apr 16 18:02:40.254938 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:40.254622 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8w9f" podUID="378c671e-1d04-4256-a001-a7ce2a0d3b86" Apr 16 18:02:40.254938 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:40.254675 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwdrb" Apr 16 18:02:40.254938 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:40.254756 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pwdrb" podUID="9d78cda3-41b6-4a64-9426-20bac8b58ad1" Apr 16 18:02:41.251239 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:41.251198 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vnntp" Apr 16 18:02:41.251426 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:41.251356 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vnntp" podUID="e1d07798-4149-483d-a793-b43a2b14fdbf" Apr 16 18:02:42.250593 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:42.250557 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8w9f" Apr 16 18:02:42.251048 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:42.250686 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8w9f" podUID="378c671e-1d04-4256-a001-a7ce2a0d3b86" Apr 16 18:02:42.251048 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:42.250862 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwdrb" Apr 16 18:02:42.251048 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:42.250971 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pwdrb" podUID="9d78cda3-41b6-4a64-9426-20bac8b58ad1" Apr 16 18:02:43.250881 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:43.250648 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vnntp" Apr 16 18:02:43.251425 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:43.250999 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vnntp" podUID="e1d07798-4149-483d-a793-b43a2b14fdbf" Apr 16 18:02:43.888249 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:43.888214 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs\") pod \"network-metrics-daemon-vnntp\" (UID: \"e1d07798-4149-483d-a793-b43a2b14fdbf\") " pod="openshift-multus/network-metrics-daemon-vnntp" Apr 16 18:02:43.888434 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:43.888407 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:43.888502 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:43.888489 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs podName:e1d07798-4149-483d-a793-b43a2b14fdbf nodeName:}" failed. No retries permitted until 2026-04-16 18:03:15.888466057 +0000 UTC m=+66.192199449 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs") pod "network-metrics-daemon-vnntp" (UID: "e1d07798-4149-483d-a793-b43a2b14fdbf") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:43.989214 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:43.989184 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8l98t\" (UniqueName: \"kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t\") pod \"network-check-target-r8w9f\" (UID: \"378c671e-1d04-4256-a001-a7ce2a0d3b86\") " pod="openshift-network-diagnostics/network-check-target-r8w9f" Apr 16 18:02:43.989441 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:43.989342 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:43.989441 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:43.989366 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:43.989441 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:43.989378 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8l98t for pod openshift-network-diagnostics/network-check-target-r8w9f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:43.989441 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:43.989431 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t podName:378c671e-1d04-4256-a001-a7ce2a0d3b86 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:03:15.989417206 +0000 UTC m=+66.293150584 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-8l98t" (UniqueName: "kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t") pod "network-check-target-r8w9f" (UID: "378c671e-1d04-4256-a001-a7ce2a0d3b86") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:44.001138 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.001121 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-55.ec2.internal" event="NodeReady" Apr 16 18:02:44.001262 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.001251 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 18:02:44.085582 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.085550 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-l5w8p"] Apr 16 18:02:44.093144 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.093120 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kf467"] Apr 16 18:02:44.093277 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.093259 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-l5w8p" Apr 16 18:02:44.099414 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.099388 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 18:02:44.099548 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.099455 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 18:02:44.099548 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.099455 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-j5p9z\"" Apr 16 18:02:44.109249 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.109219 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kf467"] Apr 16 18:02:44.109374 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.109361 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kf467" Apr 16 18:02:44.111821 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.111792 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-l5w8p"] Apr 16 18:02:44.118092 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.118072 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 18:02:44.118209 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.118121 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 18:02:44.118209 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.118149 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 18:02:44.118381 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.118367 2578 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dxq2x\"" Apr 16 18:02:44.190932 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.190868 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf5f2e9a-c558-4494-9601-e1ad83f7056e-config-volume\") pod \"dns-default-l5w8p\" (UID: \"bf5f2e9a-c558-4494-9601-e1ad83f7056e\") " pod="openshift-dns/dns-default-l5w8p" Apr 16 18:02:44.190932 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.190895 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bf5f2e9a-c558-4494-9601-e1ad83f7056e-tmp-dir\") pod \"dns-default-l5w8p\" (UID: \"bf5f2e9a-c558-4494-9601-e1ad83f7056e\") " pod="openshift-dns/dns-default-l5w8p" Apr 16 18:02:44.190932 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.190919 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmwmb\" (UniqueName: \"kubernetes.io/projected/bf5f2e9a-c558-4494-9601-e1ad83f7056e-kube-api-access-wmwmb\") pod \"dns-default-l5w8p\" (UID: \"bf5f2e9a-c558-4494-9601-e1ad83f7056e\") " pod="openshift-dns/dns-default-l5w8p" Apr 16 18:02:44.191151 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.190962 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert\") pod \"ingress-canary-kf467\" (UID: \"b360b9d8-466f-4200-b65e-ee3078c85b2f\") " pod="openshift-ingress-canary/ingress-canary-kf467" Apr 16 18:02:44.191151 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.190979 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls\") pod \"dns-default-l5w8p\" (UID: \"bf5f2e9a-c558-4494-9601-e1ad83f7056e\") " pod="openshift-dns/dns-default-l5w8p" Apr 16 18:02:44.191151 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.191000 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7l75\" (UniqueName: \"kubernetes.io/projected/b360b9d8-466f-4200-b65e-ee3078c85b2f-kube-api-access-s7l75\") pod \"ingress-canary-kf467\" (UID: \"b360b9d8-466f-4200-b65e-ee3078c85b2f\") " pod="openshift-ingress-canary/ingress-canary-kf467" Apr 16 18:02:44.250227 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.250203 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwdrb" Apr 16 18:02:44.250388 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.250216 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8w9f" Apr 16 18:02:44.253435 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.253413 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:02:44.253899 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.253449 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:02:44.253899 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.253537 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:02:44.253899 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.253797 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fs95d\"" Apr 16 18:02:44.292223 ip-10-0-134-55 kubenswrapper[2578]: I0416 
18:02:44.292198 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wmwmb\" (UniqueName: \"kubernetes.io/projected/bf5f2e9a-c558-4494-9601-e1ad83f7056e-kube-api-access-wmwmb\") pod \"dns-default-l5w8p\" (UID: \"bf5f2e9a-c558-4494-9601-e1ad83f7056e\") " pod="openshift-dns/dns-default-l5w8p" Apr 16 18:02:44.292357 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.292233 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert\") pod \"ingress-canary-kf467\" (UID: \"b360b9d8-466f-4200-b65e-ee3078c85b2f\") " pod="openshift-ingress-canary/ingress-canary-kf467" Apr 16 18:02:44.292357 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.292250 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls\") pod \"dns-default-l5w8p\" (UID: \"bf5f2e9a-c558-4494-9601-e1ad83f7056e\") " pod="openshift-dns/dns-default-l5w8p" Apr 16 18:02:44.292357 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.292274 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7l75\" (UniqueName: \"kubernetes.io/projected/b360b9d8-466f-4200-b65e-ee3078c85b2f-kube-api-access-s7l75\") pod \"ingress-canary-kf467\" (UID: \"b360b9d8-466f-4200-b65e-ee3078c85b2f\") " pod="openshift-ingress-canary/ingress-canary-kf467" Apr 16 18:02:44.292506 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.292366 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf5f2e9a-c558-4494-9601-e1ad83f7056e-config-volume\") pod \"dns-default-l5w8p\" (UID: \"bf5f2e9a-c558-4494-9601-e1ad83f7056e\") " pod="openshift-dns/dns-default-l5w8p" Apr 16 18:02:44.292506 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:44.292368 2578 
secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:02:44.292506 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.292390 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bf5f2e9a-c558-4494-9601-e1ad83f7056e-tmp-dir\") pod \"dns-default-l5w8p\" (UID: \"bf5f2e9a-c558-4494-9601-e1ad83f7056e\") " pod="openshift-dns/dns-default-l5w8p" Apr 16 18:02:44.292506 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:44.292435 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls podName:bf5f2e9a-c558-4494-9601-e1ad83f7056e nodeName:}" failed. No retries permitted until 2026-04-16 18:02:44.79241588 +0000 UTC m=+35.096149253 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls") pod "dns-default-l5w8p" (UID: "bf5f2e9a-c558-4494-9601-e1ad83f7056e") : secret "dns-default-metrics-tls" not found Apr 16 18:02:44.292506 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:44.292370 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:02:44.292506 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:44.292509 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert podName:b360b9d8-466f-4200-b65e-ee3078c85b2f nodeName:}" failed. No retries permitted until 2026-04-16 18:02:44.792490468 +0000 UTC m=+35.096223857 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert") pod "ingress-canary-kf467" (UID: "b360b9d8-466f-4200-b65e-ee3078c85b2f") : secret "canary-serving-cert" not found Apr 16 18:02:44.292801 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.292754 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bf5f2e9a-c558-4494-9601-e1ad83f7056e-tmp-dir\") pod \"dns-default-l5w8p\" (UID: \"bf5f2e9a-c558-4494-9601-e1ad83f7056e\") " pod="openshift-dns/dns-default-l5w8p" Apr 16 18:02:44.292948 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.292929 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf5f2e9a-c558-4494-9601-e1ad83f7056e-config-volume\") pod \"dns-default-l5w8p\" (UID: \"bf5f2e9a-c558-4494-9601-e1ad83f7056e\") " pod="openshift-dns/dns-default-l5w8p" Apr 16 18:02:44.305020 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.304998 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmwmb\" (UniqueName: \"kubernetes.io/projected/bf5f2e9a-c558-4494-9601-e1ad83f7056e-kube-api-access-wmwmb\") pod \"dns-default-l5w8p\" (UID: \"bf5f2e9a-c558-4494-9601-e1ad83f7056e\") " pod="openshift-dns/dns-default-l5w8p" Apr 16 18:02:44.309614 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.309577 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7l75\" (UniqueName: \"kubernetes.io/projected/b360b9d8-466f-4200-b65e-ee3078c85b2f-kube-api-access-s7l75\") pod \"ingress-canary-kf467\" (UID: \"b360b9d8-466f-4200-b65e-ee3078c85b2f\") " pod="openshift-ingress-canary/ingress-canary-kf467" Apr 16 18:02:44.795373 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.795338 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert\") pod \"ingress-canary-kf467\" (UID: \"b360b9d8-466f-4200-b65e-ee3078c85b2f\") " pod="openshift-ingress-canary/ingress-canary-kf467" Apr 16 18:02:44.795572 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:44.795380 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls\") pod \"dns-default-l5w8p\" (UID: \"bf5f2e9a-c558-4494-9601-e1ad83f7056e\") " pod="openshift-dns/dns-default-l5w8p" Apr 16 18:02:44.795572 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:44.795490 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:02:44.795572 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:44.795517 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:02:44.795572 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:44.795561 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert podName:b360b9d8-466f-4200-b65e-ee3078c85b2f nodeName:}" failed. No retries permitted until 2026-04-16 18:02:45.795545076 +0000 UTC m=+36.099278454 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert") pod "ingress-canary-kf467" (UID: "b360b9d8-466f-4200-b65e-ee3078c85b2f") : secret "canary-serving-cert" not found Apr 16 18:02:44.795572 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:44.795574 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls podName:bf5f2e9a-c558-4494-9601-e1ad83f7056e nodeName:}" failed. 
No retries permitted until 2026-04-16 18:02:45.795568663 +0000 UTC m=+36.099302037 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls") pod "dns-default-l5w8p" (UID: "bf5f2e9a-c558-4494-9601-e1ad83f7056e") : secret "dns-default-metrics-tls" not found Apr 16 18:02:45.250730 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:45.250692 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vnntp" Apr 16 18:02:45.253783 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:45.253763 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:02:45.254321 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:45.253769 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jt7dn\"" Apr 16 18:02:45.439136 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:45.439102 2578 generic.go:358] "Generic (PLEG): container finished" podID="2d9e5152-75a1-43b4-9c2b-f42acb2075df" containerID="b3809d16fdca061a0824bdee7fdaa285b09e97a1e6cd090f329fffe6d7045675" exitCode=0 Apr 16 18:02:45.439311 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:45.439150 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dgmt6" event={"ID":"2d9e5152-75a1-43b4-9c2b-f42acb2075df","Type":"ContainerDied","Data":"b3809d16fdca061a0824bdee7fdaa285b09e97a1e6cd090f329fffe6d7045675"} Apr 16 18:02:45.802943 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:45.802852 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert\") pod \"ingress-canary-kf467\" (UID: \"b360b9d8-466f-4200-b65e-ee3078c85b2f\") " 
pod="openshift-ingress-canary/ingress-canary-kf467" Apr 16 18:02:45.802943 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:45.802892 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls\") pod \"dns-default-l5w8p\" (UID: \"bf5f2e9a-c558-4494-9601-e1ad83f7056e\") " pod="openshift-dns/dns-default-l5w8p" Apr 16 18:02:45.803117 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:45.802986 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:02:45.803117 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:45.803025 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:02:45.803117 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:45.803051 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert podName:b360b9d8-466f-4200-b65e-ee3078c85b2f nodeName:}" failed. No retries permitted until 2026-04-16 18:02:47.803033986 +0000 UTC m=+38.106767359 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert") pod "ingress-canary-kf467" (UID: "b360b9d8-466f-4200-b65e-ee3078c85b2f") : secret "canary-serving-cert" not found Apr 16 18:02:45.803117 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:45.803069 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls podName:bf5f2e9a-c558-4494-9601-e1ad83f7056e nodeName:}" failed. No retries permitted until 2026-04-16 18:02:47.803057639 +0000 UTC m=+38.106791012 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls") pod "dns-default-l5w8p" (UID: "bf5f2e9a-c558-4494-9601-e1ad83f7056e") : secret "dns-default-metrics-tls" not found Apr 16 18:02:46.445322 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:46.445291 2578 generic.go:358] "Generic (PLEG): container finished" podID="2d9e5152-75a1-43b4-9c2b-f42acb2075df" containerID="fdbaa5e5a6ba055adaa5a4025dcfc66471ca159d182000b654554d0cc4e9b963" exitCode=0 Apr 16 18:02:46.445772 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:46.445350 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dgmt6" event={"ID":"2d9e5152-75a1-43b4-9c2b-f42acb2075df","Type":"ContainerDied","Data":"fdbaa5e5a6ba055adaa5a4025dcfc66471ca159d182000b654554d0cc4e9b963"} Apr 16 18:02:47.449535 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:47.449512 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dgmt6" event={"ID":"2d9e5152-75a1-43b4-9c2b-f42acb2075df","Type":"ContainerStarted","Data":"c98c0eed359cdf8a1078526eff7ea57badbae042c522e0686bc66d6fe629f9bf"} Apr 16 18:02:47.476711 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:47.476663 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dgmt6" podStartSLOduration=6.018585683 podStartE2EDuration="37.476645668s" podCreationTimestamp="2026-04-16 18:02:10 +0000 UTC" firstStartedPulling="2026-04-16 18:02:12.843131694 +0000 UTC m=+3.146865078" lastFinishedPulling="2026-04-16 18:02:44.301191687 +0000 UTC m=+34.604925063" observedRunningTime="2026-04-16 18:02:47.475973652 +0000 UTC m=+37.779707049" watchObservedRunningTime="2026-04-16 18:02:47.476645668 +0000 UTC m=+37.780379065" Apr 16 18:02:47.816111 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:47.816070 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert\") pod \"ingress-canary-kf467\" (UID: \"b360b9d8-466f-4200-b65e-ee3078c85b2f\") " pod="openshift-ingress-canary/ingress-canary-kf467" Apr 16 18:02:47.816111 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:47.816115 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls\") pod \"dns-default-l5w8p\" (UID: \"bf5f2e9a-c558-4494-9601-e1ad83f7056e\") " pod="openshift-dns/dns-default-l5w8p" Apr 16 18:02:47.816318 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:47.816214 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:02:47.816318 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:47.816229 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:02:47.816318 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:47.816284 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert podName:b360b9d8-466f-4200-b65e-ee3078c85b2f nodeName:}" failed. No retries permitted until 2026-04-16 18:02:51.816265556 +0000 UTC m=+42.119998929 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert") pod "ingress-canary-kf467" (UID: "b360b9d8-466f-4200-b65e-ee3078c85b2f") : secret "canary-serving-cert" not found Apr 16 18:02:47.816318 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:47.816298 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls podName:bf5f2e9a-c558-4494-9601-e1ad83f7056e nodeName:}" failed. 
No retries permitted until 2026-04-16 18:02:51.816292168 +0000 UTC m=+42.120025541 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls") pod "dns-default-l5w8p" (UID: "bf5f2e9a-c558-4494-9601-e1ad83f7056e") : secret "dns-default-metrics-tls" not found
Apr 16 18:02:51.845822 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:51.845767 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert\") pod \"ingress-canary-kf467\" (UID: \"b360b9d8-466f-4200-b65e-ee3078c85b2f\") " pod="openshift-ingress-canary/ingress-canary-kf467"
Apr 16 18:02:51.845822 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:51.845817 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls\") pod \"dns-default-l5w8p\" (UID: \"bf5f2e9a-c558-4494-9601-e1ad83f7056e\") " pod="openshift-dns/dns-default-l5w8p"
Apr 16 18:02:51.846253 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:51.845932 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:02:51.846253 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:51.845951 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:02:51.846253 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:51.845998 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert podName:b360b9d8-466f-4200-b65e-ee3078c85b2f nodeName:}" failed. No retries permitted until 2026-04-16 18:02:59.845981138 +0000 UTC m=+50.149714511 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert") pod "ingress-canary-kf467" (UID: "b360b9d8-466f-4200-b65e-ee3078c85b2f") : secret "canary-serving-cert" not found
Apr 16 18:02:51.846253 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:51.846013 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls podName:bf5f2e9a-c558-4494-9601-e1ad83f7056e nodeName:}" failed. No retries permitted until 2026-04-16 18:02:59.846007657 +0000 UTC m=+50.149741030 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls") pod "dns-default-l5w8p" (UID: "bf5f2e9a-c558-4494-9601-e1ad83f7056e") : secret "dns-default-metrics-tls" not found
Apr 16 18:02:54.768620 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:54.768572 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9d78cda3-41b6-4a64-9426-20bac8b58ad1-original-pull-secret\") pod \"global-pull-secret-syncer-pwdrb\" (UID: \"9d78cda3-41b6-4a64-9426-20bac8b58ad1\") " pod="kube-system/global-pull-secret-syncer-pwdrb"
Apr 16 18:02:54.771302 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:54.771283 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9d78cda3-41b6-4a64-9426-20bac8b58ad1-original-pull-secret\") pod \"global-pull-secret-syncer-pwdrb\" (UID: \"9d78cda3-41b6-4a64-9426-20bac8b58ad1\") " pod="kube-system/global-pull-secret-syncer-pwdrb"
Apr 16 18:02:55.059919 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:55.059842 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwdrb"
Apr 16 18:02:55.255578 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:55.255542 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pwdrb"]
Apr 16 18:02:55.463955 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:55.463866 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pwdrb" event={"ID":"9d78cda3-41b6-4a64-9426-20bac8b58ad1","Type":"ContainerStarted","Data":"d7cf5f49b999cdd21d083a18837445bda07cc616b7d7adddea6f9ca44d3ecf47"}
Apr 16 18:02:59.917544 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:59.917449 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert\") pod \"ingress-canary-kf467\" (UID: \"b360b9d8-466f-4200-b65e-ee3078c85b2f\") " pod="openshift-ingress-canary/ingress-canary-kf467"
Apr 16 18:02:59.917544 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:02:59.917486 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls\") pod \"dns-default-l5w8p\" (UID: \"bf5f2e9a-c558-4494-9601-e1ad83f7056e\") " pod="openshift-dns/dns-default-l5w8p"
Apr 16 18:02:59.917996 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:59.917579 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:02:59.917996 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:59.917622 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:02:59.917996 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:59.917648 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls podName:bf5f2e9a-c558-4494-9601-e1ad83f7056e nodeName:}" failed. No retries permitted until 2026-04-16 18:03:15.917634038 +0000 UTC m=+66.221367411 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls") pod "dns-default-l5w8p" (UID: "bf5f2e9a-c558-4494-9601-e1ad83f7056e") : secret "dns-default-metrics-tls" not found
Apr 16 18:02:59.917996 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:02:59.917686 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert podName:b360b9d8-466f-4200-b65e-ee3078c85b2f nodeName:}" failed. No retries permitted until 2026-04-16 18:03:15.91766749 +0000 UTC m=+66.221400894 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert") pod "ingress-canary-kf467" (UID: "b360b9d8-466f-4200-b65e-ee3078c85b2f") : secret "canary-serving-cert" not found
Apr 16 18:03:00.475620 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:00.475564 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pwdrb" event={"ID":"9d78cda3-41b6-4a64-9426-20bac8b58ad1","Type":"ContainerStarted","Data":"4d786b0cce0d1dc8cac8924fbb74b9332354e46a3182c862c6dbd1964f70c3e5"}
Apr 16 18:03:00.494468 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:00.494418 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-pwdrb" podStartSLOduration=34.096258014 podStartE2EDuration="38.494400383s" podCreationTimestamp="2026-04-16 18:02:22 +0000 UTC" firstStartedPulling="2026-04-16 18:02:55.261102305 +0000 UTC m=+45.564835680" lastFinishedPulling="2026-04-16 18:02:59.659244666 +0000 UTC m=+49.962978049" observedRunningTime="2026-04-16 18:03:00.493860375 +0000 UTC m=+50.797593769" watchObservedRunningTime="2026-04-16 18:03:00.494400383 +0000 UTC m=+50.798133778"
Apr 16 18:03:02.472552 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.472505 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"]
Apr 16 18:03:02.514227 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.514196 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"]
Apr 16 18:03:02.514227 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.514226 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"
Apr 16 18:03:02.519801 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.519772 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 16 18:03:02.519801 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.519796 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 18:03:02.520050 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.520029 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 16 18:03:02.520097 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.520068 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 16 18:03:02.521912 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.521894 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 18:03:02.522068 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.522043 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 18:03:02.522175 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.522116 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 16 18:03:02.537504 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.537486 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/ddae9640-751b-43e0-afe8-180c52b1a0ce-ca\") pod \"cluster-proxy-proxy-agent-5f859c47df-tsh8s\" (UID: \"ddae9640-751b-43e0-afe8-180c52b1a0ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"
Apr 16 18:03:02.537616 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.537510 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ddae9640-751b-43e0-afe8-180c52b1a0ce-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5f859c47df-tsh8s\" (UID: \"ddae9640-751b-43e0-afe8-180c52b1a0ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"
Apr 16 18:03:02.537616 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.537528 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/ddae9640-751b-43e0-afe8-180c52b1a0ce-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5f859c47df-tsh8s\" (UID: \"ddae9640-751b-43e0-afe8-180c52b1a0ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"
Apr 16 18:03:02.537616 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.537546 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/ddae9640-751b-43e0-afe8-180c52b1a0ce-hub\") pod \"cluster-proxy-proxy-agent-5f859c47df-tsh8s\" (UID: \"ddae9640-751b-43e0-afe8-180c52b1a0ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"
Apr 16 18:03:02.537616 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.537594 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/ddae9640-751b-43e0-afe8-180c52b1a0ce-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5f859c47df-tsh8s\" (UID: \"ddae9640-751b-43e0-afe8-180c52b1a0ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"
Apr 16 18:03:02.537755 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.537668 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmrjf\" (UniqueName: \"kubernetes.io/projected/ddae9640-751b-43e0-afe8-180c52b1a0ce-kube-api-access-pmrjf\") pod \"cluster-proxy-proxy-agent-5f859c47df-tsh8s\" (UID: \"ddae9640-751b-43e0-afe8-180c52b1a0ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"
Apr 16 18:03:02.638678 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.638650 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/ddae9640-751b-43e0-afe8-180c52b1a0ce-hub\") pod \"cluster-proxy-proxy-agent-5f859c47df-tsh8s\" (UID: \"ddae9640-751b-43e0-afe8-180c52b1a0ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"
Apr 16 18:03:02.638797 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.638683 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/ddae9640-751b-43e0-afe8-180c52b1a0ce-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5f859c47df-tsh8s\" (UID: \"ddae9640-751b-43e0-afe8-180c52b1a0ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"
Apr 16 18:03:02.638797 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.638703 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmrjf\" (UniqueName: \"kubernetes.io/projected/ddae9640-751b-43e0-afe8-180c52b1a0ce-kube-api-access-pmrjf\") pod \"cluster-proxy-proxy-agent-5f859c47df-tsh8s\" (UID: \"ddae9640-751b-43e0-afe8-180c52b1a0ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"
Apr 16 18:03:02.638907 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.638801 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/ddae9640-751b-43e0-afe8-180c52b1a0ce-ca\") pod \"cluster-proxy-proxy-agent-5f859c47df-tsh8s\" (UID: \"ddae9640-751b-43e0-afe8-180c52b1a0ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"
Apr 16 18:03:02.638907 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.638827 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ddae9640-751b-43e0-afe8-180c52b1a0ce-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5f859c47df-tsh8s\" (UID: \"ddae9640-751b-43e0-afe8-180c52b1a0ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"
Apr 16 18:03:02.638907 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.638849 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/ddae9640-751b-43e0-afe8-180c52b1a0ce-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5f859c47df-tsh8s\" (UID: \"ddae9640-751b-43e0-afe8-180c52b1a0ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"
Apr 16 18:03:02.639766 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.639740 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/ddae9640-751b-43e0-afe8-180c52b1a0ce-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5f859c47df-tsh8s\" (UID: \"ddae9640-751b-43e0-afe8-180c52b1a0ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"
Apr 16 18:03:02.641405 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.641381 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/ddae9640-751b-43e0-afe8-180c52b1a0ce-hub\") pod \"cluster-proxy-proxy-agent-5f859c47df-tsh8s\" (UID: \"ddae9640-751b-43e0-afe8-180c52b1a0ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"
Apr 16 18:03:02.641546 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.641435 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/ddae9640-751b-43e0-afe8-180c52b1a0ce-ca\") pod \"cluster-proxy-proxy-agent-5f859c47df-tsh8s\" (UID: \"ddae9640-751b-43e0-afe8-180c52b1a0ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"
Apr 16 18:03:02.641546 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.641465 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/ddae9640-751b-43e0-afe8-180c52b1a0ce-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5f859c47df-tsh8s\" (UID: \"ddae9640-751b-43e0-afe8-180c52b1a0ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"
Apr 16 18:03:02.641546 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.641488 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ddae9640-751b-43e0-afe8-180c52b1a0ce-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5f859c47df-tsh8s\" (UID: \"ddae9640-751b-43e0-afe8-180c52b1a0ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"
Apr 16 18:03:02.648930 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.648906 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmrjf\" (UniqueName: \"kubernetes.io/projected/ddae9640-751b-43e0-afe8-180c52b1a0ce-kube-api-access-pmrjf\") pod \"cluster-proxy-proxy-agent-5f859c47df-tsh8s\" (UID: \"ddae9640-751b-43e0-afe8-180c52b1a0ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"
Apr 16 18:03:02.834511 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.834480 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"
Apr 16 18:03:02.963828 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:02.963796 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s"]
Apr 16 18:03:02.966658 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:03:02.966631 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddae9640_751b_43e0_afe8_180c52b1a0ce.slice/crio-3c4c6950e2cec8e8764fcba2a0f43b73494f0ebd06fd18f6dfb7bdeed64ab763 WatchSource:0}: Error finding container 3c4c6950e2cec8e8764fcba2a0f43b73494f0ebd06fd18f6dfb7bdeed64ab763: Status 404 returned error can't find the container with id 3c4c6950e2cec8e8764fcba2a0f43b73494f0ebd06fd18f6dfb7bdeed64ab763
Apr 16 18:03:03.481678 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:03.481642 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s" event={"ID":"ddae9640-751b-43e0-afe8-180c52b1a0ce","Type":"ContainerStarted","Data":"3c4c6950e2cec8e8764fcba2a0f43b73494f0ebd06fd18f6dfb7bdeed64ab763"}
Apr 16 18:03:06.488447 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:06.488416 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s" event={"ID":"ddae9640-751b-43e0-afe8-180c52b1a0ce","Type":"ContainerStarted","Data":"f4e3d0b69b3a5731927ac40e2d4745795c2e99922ca1f317c733b8da7d4bc253"}
Apr 16 18:03:08.494910 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:08.494877 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s" event={"ID":"ddae9640-751b-43e0-afe8-180c52b1a0ce","Type":"ContainerStarted","Data":"364352ef001efe1da543418b53ee4ebd0053516d7a73ecd28b2c56accfbdfcc5"}
Apr 16 18:03:08.495256 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:08.494919 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s" event={"ID":"ddae9640-751b-43e0-afe8-180c52b1a0ce","Type":"ContainerStarted","Data":"7808025a28ed820a434ee0c4aa64f6b516abca1706c5466bd664e0eb4143a1e0"}
Apr 16 18:03:08.514805 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:08.514760 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s" podStartSLOduration=1.156544751 podStartE2EDuration="6.514746979s" podCreationTimestamp="2026-04-16 18:03:02 +0000 UTC" firstStartedPulling="2026-04-16 18:03:02.968454841 +0000 UTC m=+53.272188218" lastFinishedPulling="2026-04-16 18:03:08.326657073 +0000 UTC m=+58.630390446" observedRunningTime="2026-04-16 18:03:08.513321218 +0000 UTC m=+58.817054612" watchObservedRunningTime="2026-04-16 18:03:08.514746979 +0000 UTC m=+58.818480374"
Apr 16 18:03:09.435810 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:09.435783 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r77l7"
Apr 16 18:03:15.934899 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:15.934863 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert\") pod \"ingress-canary-kf467\" (UID: \"b360b9d8-466f-4200-b65e-ee3078c85b2f\") " pod="openshift-ingress-canary/ingress-canary-kf467"
Apr 16 18:03:15.934899 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:15.934900 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls\") pod \"dns-default-l5w8p\" (UID: \"bf5f2e9a-c558-4494-9601-e1ad83f7056e\") " pod="openshift-dns/dns-default-l5w8p"
Apr 16 18:03:15.935413 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:15.934927 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs\") pod \"network-metrics-daemon-vnntp\" (UID: \"e1d07798-4149-483d-a793-b43a2b14fdbf\") " pod="openshift-multus/network-metrics-daemon-vnntp"
Apr 16 18:03:15.935413 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:03:15.935015 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:03:15.935413 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:03:15.935025 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:03:15.935413 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:03:15.935085 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert podName:b360b9d8-466f-4200-b65e-ee3078c85b2f nodeName:}" failed. No retries permitted until 2026-04-16 18:03:47.935068543 +0000 UTC m=+98.238801916 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert") pod "ingress-canary-kf467" (UID: "b360b9d8-466f-4200-b65e-ee3078c85b2f") : secret "canary-serving-cert" not found
Apr 16 18:03:15.935413 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:03:15.935101 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls podName:bf5f2e9a-c558-4494-9601-e1ad83f7056e nodeName:}" failed. No retries permitted until 2026-04-16 18:03:47.935093992 +0000 UTC m=+98.238827366 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls") pod "dns-default-l5w8p" (UID: "bf5f2e9a-c558-4494-9601-e1ad83f7056e") : secret "dns-default-metrics-tls" not found
Apr 16 18:03:15.937750 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:15.937733 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 18:03:15.945952 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:03:15.945928 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:03:15.946014 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:03:15.945980 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs podName:e1d07798-4149-483d-a793-b43a2b14fdbf nodeName:}" failed. No retries permitted until 2026-04-16 18:04:19.945966159 +0000 UTC m=+130.249699532 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs") pod "network-metrics-daemon-vnntp" (UID: "e1d07798-4149-483d-a793-b43a2b14fdbf") : secret "metrics-daemon-secret" not found
Apr 16 18:03:16.035269 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:16.035235 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8l98t\" (UniqueName: \"kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t\") pod \"network-check-target-r8w9f\" (UID: \"378c671e-1d04-4256-a001-a7ce2a0d3b86\") " pod="openshift-network-diagnostics/network-check-target-r8w9f"
Apr 16 18:03:16.038142 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:16.038126 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 18:03:16.048830 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:16.048808 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 18:03:16.059251 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:16.059228 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l98t\" (UniqueName: \"kubernetes.io/projected/378c671e-1d04-4256-a001-a7ce2a0d3b86-kube-api-access-8l98t\") pod \"network-check-target-r8w9f\" (UID: \"378c671e-1d04-4256-a001-a7ce2a0d3b86\") " pod="openshift-network-diagnostics/network-check-target-r8w9f"
Apr 16 18:03:16.069441 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:16.069422 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fs95d\""
Apr 16 18:03:16.077953 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:16.077929 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8w9f"
Apr 16 18:03:16.188664 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:16.188576 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-r8w9f"]
Apr 16 18:03:16.191681 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:03:16.191655 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod378c671e_1d04_4256_a001_a7ce2a0d3b86.slice/crio-349e5e4a16749b96d8a7b4fde65349b5932dac01041b5a2229eb22ce12d91aab WatchSource:0}: Error finding container 349e5e4a16749b96d8a7b4fde65349b5932dac01041b5a2229eb22ce12d91aab: Status 404 returned error can't find the container with id 349e5e4a16749b96d8a7b4fde65349b5932dac01041b5a2229eb22ce12d91aab
Apr 16 18:03:16.512435 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:16.512393 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-r8w9f" event={"ID":"378c671e-1d04-4256-a001-a7ce2a0d3b86","Type":"ContainerStarted","Data":"349e5e4a16749b96d8a7b4fde65349b5932dac01041b5a2229eb22ce12d91aab"}
Apr 16 18:03:19.520656 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:19.520611 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-r8w9f" event={"ID":"378c671e-1d04-4256-a001-a7ce2a0d3b86","Type":"ContainerStarted","Data":"a2b52ec99da164e7ef54a29b14782a9596f2f901724bca418d09897a394c7058"}
Apr 16 18:03:19.521088 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:19.520731 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-r8w9f"
Apr 16 18:03:19.538323 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:19.538274 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-r8w9f" podStartSLOduration=66.771477183 podStartE2EDuration="1m9.538261075s" podCreationTimestamp="2026-04-16 18:02:10 +0000 UTC" firstStartedPulling="2026-04-16 18:03:16.193471028 +0000 UTC m=+66.497204401" lastFinishedPulling="2026-04-16 18:03:18.960254917 +0000 UTC m=+69.263988293" observedRunningTime="2026-04-16 18:03:19.537267282 +0000 UTC m=+69.841000677" watchObservedRunningTime="2026-04-16 18:03:19.538261075 +0000 UTC m=+69.841994502"
Apr 16 18:03:47.949818 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:47.949768 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert\") pod \"ingress-canary-kf467\" (UID: \"b360b9d8-466f-4200-b65e-ee3078c85b2f\") " pod="openshift-ingress-canary/ingress-canary-kf467"
Apr 16 18:03:47.949818 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:47.949818 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls\") pod \"dns-default-l5w8p\" (UID: \"bf5f2e9a-c558-4494-9601-e1ad83f7056e\") " pod="openshift-dns/dns-default-l5w8p"
Apr 16 18:03:47.950318 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:03:47.949914 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:03:47.950318 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:03:47.949914 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:03:47.950318 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:03:47.949965 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls podName:bf5f2e9a-c558-4494-9601-e1ad83f7056e nodeName:}" failed. No retries permitted until 2026-04-16 18:04:51.949952078 +0000 UTC m=+162.253685452 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls") pod "dns-default-l5w8p" (UID: "bf5f2e9a-c558-4494-9601-e1ad83f7056e") : secret "dns-default-metrics-tls" not found
Apr 16 18:03:47.950318 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:03:47.949979 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert podName:b360b9d8-466f-4200-b65e-ee3078c85b2f nodeName:}" failed. No retries permitted until 2026-04-16 18:04:51.949972648 +0000 UTC m=+162.253706020 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert") pod "ingress-canary-kf467" (UID: "b360b9d8-466f-4200-b65e-ee3078c85b2f") : secret "canary-serving-cert" not found
Apr 16 18:03:50.525454 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:03:50.525419 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-r8w9f"
Apr 16 18:04:19.967352 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:19.967313 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs\") pod \"network-metrics-daemon-vnntp\" (UID: \"e1d07798-4149-483d-a793-b43a2b14fdbf\") " pod="openshift-multus/network-metrics-daemon-vnntp"
Apr 16 18:04:19.967817 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:19.967419 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:04:19.967817 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:19.967478 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs podName:e1d07798-4149-483d-a793-b43a2b14fdbf nodeName:}" failed. No retries permitted until 2026-04-16 18:06:21.967464571 +0000 UTC m=+252.271197944 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs") pod "network-metrics-daemon-vnntp" (UID: "e1d07798-4149-483d-a793-b43a2b14fdbf") : secret "metrics-daemon-secret" not found
Apr 16 18:04:39.945978 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:39.945947 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-9b8648dc-vjsvl"]
Apr 16 18:04:39.948734 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:39.948717 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9b8648dc-vjsvl"
Apr 16 18:04:39.953347 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:39.953326 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 18:04:39.956064 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:39.956047 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 18:04:39.956595 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:39.956580 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 18:04:39.957040 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:39.957028 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jwfwh\""
Apr 16 18:04:39.965095 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:39.965076 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 18:04:39.980421 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:39.980397 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-9b8648dc-vjsvl"]
Apr 16 18:04:40.008870 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.008837 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-bound-sa-token\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl"
Apr 16 18:04:40.009011 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.008898 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-certificates\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl"
Apr 16 18:04:40.009011 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.008944 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad99f7e5-4161-446e-ad87-82c8c5677f5f-trusted-ca\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl"
Apr 16 18:04:40.009011 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.008980 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ad99f7e5-4161-446e-ad87-82c8c5677f5f-image-registry-private-configuration\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl"
Apr 16 18:04:40.009147 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.009028 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad99f7e5-4161-446e-ad87-82c8c5677f5f-installation-pull-secrets\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl"
Apr 16 18:04:40.009147 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.009049 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-tls\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl"
Apr 16 18:04:40.009147 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.009066 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad99f7e5-4161-446e-ad87-82c8c5677f5f-ca-trust-extracted\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl"
Apr 16 18:04:40.009147 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.009088 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xflm7\" (UniqueName: \"kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-kube-api-access-xflm7\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl"
Apr 16 18:04:40.109821 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.109782 2578
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-certificates\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:40.109821 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.109818 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad99f7e5-4161-446e-ad87-82c8c5677f5f-trusted-ca\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:40.110050 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.109846 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ad99f7e5-4161-446e-ad87-82c8c5677f5f-image-registry-private-configuration\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:40.110050 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.109902 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad99f7e5-4161-446e-ad87-82c8c5677f5f-installation-pull-secrets\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:40.110050 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.109928 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-tls\") pod \"image-registry-9b8648dc-vjsvl\" (UID: 
\"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:40.110050 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.109952 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad99f7e5-4161-446e-ad87-82c8c5677f5f-ca-trust-extracted\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:40.110050 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.109976 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xflm7\" (UniqueName: \"kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-kube-api-access-xflm7\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:40.110050 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.109999 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-bound-sa-token\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:40.110326 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:40.110069 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:04:40.110326 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:40.110097 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9b8648dc-vjsvl: secret "image-registry-tls" not found Apr 16 18:04:40.110326 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:40.110187 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-tls podName:ad99f7e5-4161-446e-ad87-82c8c5677f5f nodeName:}" failed. No retries permitted until 2026-04-16 18:04:40.610164487 +0000 UTC m=+150.913897874 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-tls") pod "image-registry-9b8648dc-vjsvl" (UID: "ad99f7e5-4161-446e-ad87-82c8c5677f5f") : secret "image-registry-tls" not found Apr 16 18:04:40.110486 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.110468 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-certificates\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:40.110778 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.110647 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad99f7e5-4161-446e-ad87-82c8c5677f5f-ca-trust-extracted\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:40.110778 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.110769 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad99f7e5-4161-446e-ad87-82c8c5677f5f-trusted-ca\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:40.112476 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.112459 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" 
(UniqueName: \"kubernetes.io/secret/ad99f7e5-4161-446e-ad87-82c8c5677f5f-image-registry-private-configuration\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:40.112573 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.112527 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad99f7e5-4161-446e-ad87-82c8c5677f5f-installation-pull-secrets\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:40.124357 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.124336 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-bound-sa-token\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:40.125743 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.125726 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xflm7\" (UniqueName: \"kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-kube-api-access-xflm7\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:40.614550 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:40.614490 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-tls\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:40.614752 
ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:40.614680 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:04:40.614752 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:40.614699 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9b8648dc-vjsvl: secret "image-registry-tls" not found Apr 16 18:04:40.614825 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:40.614766 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-tls podName:ad99f7e5-4161-446e-ad87-82c8c5677f5f nodeName:}" failed. No retries permitted until 2026-04-16 18:04:41.61474842 +0000 UTC m=+151.918481794 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-tls") pod "image-registry-9b8648dc-vjsvl" (UID: "ad99f7e5-4161-446e-ad87-82c8c5677f5f") : secret "image-registry-tls" not found Apr 16 18:04:41.622771 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:41.622727 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-tls\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:41.623150 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:41.622872 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:04:41.623150 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:41.622893 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9b8648dc-vjsvl: secret "image-registry-tls" not found Apr 
16 18:04:41.623150 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:41.622946 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-tls podName:ad99f7e5-4161-446e-ad87-82c8c5677f5f nodeName:}" failed. No retries permitted until 2026-04-16 18:04:43.622931339 +0000 UTC m=+153.926664713 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-tls") pod "image-registry-9b8648dc-vjsvl" (UID: "ad99f7e5-4161-446e-ad87-82c8c5677f5f") : secret "image-registry-tls" not found Apr 16 18:04:43.634806 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:43.634754 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-tls\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:43.635295 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:43.634928 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:04:43.635295 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:43.634950 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9b8648dc-vjsvl: secret "image-registry-tls" not found Apr 16 18:04:43.635295 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:43.635036 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-tls podName:ad99f7e5-4161-446e-ad87-82c8c5677f5f nodeName:}" failed. No retries permitted until 2026-04-16 18:04:47.635012999 +0000 UTC m=+157.938746396 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-tls") pod "image-registry-9b8648dc-vjsvl" (UID: "ad99f7e5-4161-446e-ad87-82c8c5677f5f") : secret "image-registry-tls" not found Apr 16 18:04:47.102721 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:47.102658 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-l5w8p" podUID="bf5f2e9a-c558-4494-9601-e1ad83f7056e" Apr 16 18:04:47.118968 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:47.118919 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-kf467" podUID="b360b9d8-466f-4200-b65e-ee3078c85b2f" Apr 16 18:04:47.209239 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:47.209206 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lgc4z_179b0e84-c4b0-49f4-af8c-88127e5c999a/dns-node-resolver/0.log" Apr 16 18:04:47.663524 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:47.663491 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-tls\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:47.663710 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:47.663648 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:04:47.663710 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:47.663664 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-9b8648dc-vjsvl: secret "image-registry-tls" not found Apr 16 18:04:47.663796 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:47.663719 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-tls podName:ad99f7e5-4161-446e-ad87-82c8c5677f5f nodeName:}" failed. No retries permitted until 2026-04-16 18:04:55.66370454 +0000 UTC m=+165.967437913 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-tls") pod "image-registry-9b8648dc-vjsvl" (UID: "ad99f7e5-4161-446e-ad87-82c8c5677f5f") : secret "image-registry-tls" not found Apr 16 18:04:47.688872 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:47.688847 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kf467" Apr 16 18:04:47.688970 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:47.688852 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-l5w8p" Apr 16 18:04:48.208775 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:48.208749 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-85mvt_e2c3dbfa-ec70-406e-af1c-6ac3c1aba031/node-ca/0.log" Apr 16 18:04:48.258787 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:48.258755 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-vnntp" podUID="e1d07798-4149-483d-a793-b43a2b14fdbf" Apr 16 18:04:51.995057 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:51.995003 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert\") pod \"ingress-canary-kf467\" (UID: \"b360b9d8-466f-4200-b65e-ee3078c85b2f\") " pod="openshift-ingress-canary/ingress-canary-kf467" Apr 16 18:04:51.995057 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:51.995056 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls\") pod \"dns-default-l5w8p\" (UID: \"bf5f2e9a-c558-4494-9601-e1ad83f7056e\") " pod="openshift-dns/dns-default-l5w8p" Apr 16 18:04:51.995510 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:51.995154 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:04:51.995510 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:51.995216 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert podName:b360b9d8-466f-4200-b65e-ee3078c85b2f nodeName:}" failed. 
No retries permitted until 2026-04-16 18:06:53.995200893 +0000 UTC m=+284.298934270 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert") pod "ingress-canary-kf467" (UID: "b360b9d8-466f-4200-b65e-ee3078c85b2f") : secret "canary-serving-cert" not found Apr 16 18:04:51.995510 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:51.995218 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:04:51.995510 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:04:51.995268 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls podName:bf5f2e9a-c558-4494-9601-e1ad83f7056e nodeName:}" failed. No retries permitted until 2026-04-16 18:06:53.995250985 +0000 UTC m=+284.298984358 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls") pod "dns-default-l5w8p" (UID: "bf5f2e9a-c558-4494-9601-e1ad83f7056e") : secret "dns-default-metrics-tls" not found Apr 16 18:04:55.720652 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:55.720564 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-tls\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:55.723006 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:55.722985 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-tls\") pod \"image-registry-9b8648dc-vjsvl\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " 
pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:55.857439 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:55.857408 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:55.998945 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:55.998916 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-9b8648dc-vjsvl"] Apr 16 18:04:56.001912 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:04:56.001875 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad99f7e5_4161_446e_ad87_82c8c5677f5f.slice/crio-94df097c11fbfb3a72fd672c6ff6f62e15a84529cc4893823186ba2da5c38979 WatchSource:0}: Error finding container 94df097c11fbfb3a72fd672c6ff6f62e15a84529cc4893823186ba2da5c38979: Status 404 returned error can't find the container with id 94df097c11fbfb3a72fd672c6ff6f62e15a84529cc4893823186ba2da5c38979 Apr 16 18:04:56.709220 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:56.709185 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" event={"ID":"ad99f7e5-4161-446e-ad87-82c8c5677f5f","Type":"ContainerStarted","Data":"0d1e078ea2da862b81d45f20e943183f03a2c2654a795b4a7d008de3fc21bd5f"} Apr 16 18:04:56.709220 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:56.709223 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" event={"ID":"ad99f7e5-4161-446e-ad87-82c8c5677f5f","Type":"ContainerStarted","Data":"94df097c11fbfb3a72fd672c6ff6f62e15a84529cc4893823186ba2da5c38979"} Apr 16 18:04:56.709432 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:04:56.709322 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:04:56.732956 ip-10-0-134-55 
kubenswrapper[2578]: I0416 18:04:56.732918 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" podStartSLOduration=17.732905413 podStartE2EDuration="17.732905413s" podCreationTimestamp="2026-04-16 18:04:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:04:56.731443923 +0000 UTC m=+167.035177319" watchObservedRunningTime="2026-04-16 18:04:56.732905413 +0000 UTC m=+167.036638808" Apr 16 18:05:01.250340 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:01.250302 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vnntp" Apr 16 18:05:10.292092 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.292050 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-l8df6"] Apr 16 18:05:10.296743 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.296725 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-l8df6" Apr 16 18:05:10.301533 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.301502 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:05:10.302751 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.302728 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 18:05:10.303190 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.303171 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 18:05:10.303516 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.303501 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:05:10.303752 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.303728 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-92dpg\"" Apr 16 18:05:10.318534 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.318497 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-l8df6"] Apr 16 18:05:10.322746 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.322715 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d1a46e78-ff9f-4969-b4f5-a5a6f37dc199-crio-socket\") pod \"insights-runtime-extractor-l8df6\" (UID: \"d1a46e78-ff9f-4969-b4f5-a5a6f37dc199\") " pod="openshift-insights/insights-runtime-extractor-l8df6" Apr 16 18:05:10.322850 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.322769 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" 
(UniqueName: \"kubernetes.io/empty-dir/d1a46e78-ff9f-4969-b4f5-a5a6f37dc199-data-volume\") pod \"insights-runtime-extractor-l8df6\" (UID: \"d1a46e78-ff9f-4969-b4f5-a5a6f37dc199\") " pod="openshift-insights/insights-runtime-extractor-l8df6" Apr 16 18:05:10.322850 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.322787 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d1a46e78-ff9f-4969-b4f5-a5a6f37dc199-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l8df6\" (UID: \"d1a46e78-ff9f-4969-b4f5-a5a6f37dc199\") " pod="openshift-insights/insights-runtime-extractor-l8df6" Apr 16 18:05:10.322850 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.322841 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz26t\" (UniqueName: \"kubernetes.io/projected/d1a46e78-ff9f-4969-b4f5-a5a6f37dc199-kube-api-access-qz26t\") pod \"insights-runtime-extractor-l8df6\" (UID: \"d1a46e78-ff9f-4969-b4f5-a5a6f37dc199\") " pod="openshift-insights/insights-runtime-extractor-l8df6" Apr 16 18:05:10.322974 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.322882 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d1a46e78-ff9f-4969-b4f5-a5a6f37dc199-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l8df6\" (UID: \"d1a46e78-ff9f-4969-b4f5-a5a6f37dc199\") " pod="openshift-insights/insights-runtime-extractor-l8df6" Apr 16 18:05:10.350327 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.350294 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-9b8648dc-vjsvl"] Apr 16 18:05:10.423285 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.423251 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" 
(UniqueName: \"kubernetes.io/host-path/d1a46e78-ff9f-4969-b4f5-a5a6f37dc199-crio-socket\") pod \"insights-runtime-extractor-l8df6\" (UID: \"d1a46e78-ff9f-4969-b4f5-a5a6f37dc199\") " pod="openshift-insights/insights-runtime-extractor-l8df6" Apr 16 18:05:10.423465 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.423324 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d1a46e78-ff9f-4969-b4f5-a5a6f37dc199-data-volume\") pod \"insights-runtime-extractor-l8df6\" (UID: \"d1a46e78-ff9f-4969-b4f5-a5a6f37dc199\") " pod="openshift-insights/insights-runtime-extractor-l8df6" Apr 16 18:05:10.423465 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.423350 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d1a46e78-ff9f-4969-b4f5-a5a6f37dc199-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l8df6\" (UID: \"d1a46e78-ff9f-4969-b4f5-a5a6f37dc199\") " pod="openshift-insights/insights-runtime-extractor-l8df6" Apr 16 18:05:10.423465 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.423374 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qz26t\" (UniqueName: \"kubernetes.io/projected/d1a46e78-ff9f-4969-b4f5-a5a6f37dc199-kube-api-access-qz26t\") pod \"insights-runtime-extractor-l8df6\" (UID: \"d1a46e78-ff9f-4969-b4f5-a5a6f37dc199\") " pod="openshift-insights/insights-runtime-extractor-l8df6" Apr 16 18:05:10.423465 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.423380 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d1a46e78-ff9f-4969-b4f5-a5a6f37dc199-crio-socket\") pod \"insights-runtime-extractor-l8df6\" (UID: \"d1a46e78-ff9f-4969-b4f5-a5a6f37dc199\") " pod="openshift-insights/insights-runtime-extractor-l8df6" Apr 16 18:05:10.423465 ip-10-0-134-55 
kubenswrapper[2578]: I0416 18:05:10.423412 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d1a46e78-ff9f-4969-b4f5-a5a6f37dc199-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l8df6\" (UID: \"d1a46e78-ff9f-4969-b4f5-a5a6f37dc199\") " pod="openshift-insights/insights-runtime-extractor-l8df6"
Apr 16 18:05:10.423730 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.423709 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d1a46e78-ff9f-4969-b4f5-a5a6f37dc199-data-volume\") pod \"insights-runtime-extractor-l8df6\" (UID: \"d1a46e78-ff9f-4969-b4f5-a5a6f37dc199\") " pod="openshift-insights/insights-runtime-extractor-l8df6"
Apr 16 18:05:10.424501 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.424483 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d1a46e78-ff9f-4969-b4f5-a5a6f37dc199-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l8df6\" (UID: \"d1a46e78-ff9f-4969-b4f5-a5a6f37dc199\") " pod="openshift-insights/insights-runtime-extractor-l8df6"
Apr 16 18:05:10.426452 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.426436 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d1a46e78-ff9f-4969-b4f5-a5a6f37dc199-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l8df6\" (UID: \"d1a46e78-ff9f-4969-b4f5-a5a6f37dc199\") " pod="openshift-insights/insights-runtime-extractor-l8df6"
Apr 16 18:05:10.453884 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.453851 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz26t\" (UniqueName: \"kubernetes.io/projected/d1a46e78-ff9f-4969-b4f5-a5a6f37dc199-kube-api-access-qz26t\") pod \"insights-runtime-extractor-l8df6\" (UID: \"d1a46e78-ff9f-4969-b4f5-a5a6f37dc199\") " pod="openshift-insights/insights-runtime-extractor-l8df6"
Apr 16 18:05:10.609008 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.608917 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-l8df6"
Apr 16 18:05:10.733525 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.733490 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-l8df6"]
Apr 16 18:05:10.738318 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:05:10.738287 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1a46e78_ff9f_4969_b4f5_a5a6f37dc199.slice/crio-1b61df2d60ee2e6cb22d296656bfbdd588b2585ae3d259b9ea2d37bde13303f6 WatchSource:0}: Error finding container 1b61df2d60ee2e6cb22d296656bfbdd588b2585ae3d259b9ea2d37bde13303f6: Status 404 returned error can't find the container with id 1b61df2d60ee2e6cb22d296656bfbdd588b2585ae3d259b9ea2d37bde13303f6
Apr 16 18:05:10.743967 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:10.743942 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l8df6" event={"ID":"d1a46e78-ff9f-4969-b4f5-a5a6f37dc199","Type":"ContainerStarted","Data":"1b61df2d60ee2e6cb22d296656bfbdd588b2585ae3d259b9ea2d37bde13303f6"}
Apr 16 18:05:11.748322 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:11.748281 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l8df6" event={"ID":"d1a46e78-ff9f-4969-b4f5-a5a6f37dc199","Type":"ContainerStarted","Data":"c63e63d8126757f569bcc1fcd8b24f3784219862f423a339294e3f35b6c0b3fd"}
Apr 16 18:05:11.748723 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:11.748329 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l8df6" event={"ID":"d1a46e78-ff9f-4969-b4f5-a5a6f37dc199","Type":"ContainerStarted","Data":"f4c6dd7d89cb2197971c1dcb2c930fc9fec27f033e300eaee4508ca76064fed8"}
Apr 16 18:05:13.754533 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:13.754504 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l8df6" event={"ID":"d1a46e78-ff9f-4969-b4f5-a5a6f37dc199","Type":"ContainerStarted","Data":"6e76cd44ec653a2c2d6ebd74bb4c4a5d587e6360a86c995ead1875c4b7de5ebf"}
Apr 16 18:05:13.775437 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:13.775383 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-l8df6" podStartSLOduration=1.8178989840000002 podStartE2EDuration="3.775366833s" podCreationTimestamp="2026-04-16 18:05:10 +0000 UTC" firstStartedPulling="2026-04-16 18:05:10.796326357 +0000 UTC m=+181.100059734" lastFinishedPulling="2026-04-16 18:05:12.753794207 +0000 UTC m=+183.057527583" observedRunningTime="2026-04-16 18:05:13.774665061 +0000 UTC m=+184.078398456" watchObservedRunningTime="2026-04-16 18:05:13.775366833 +0000 UTC m=+184.079100228"
Apr 16 18:05:20.355820 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:20.355793 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-9b8648dc-vjsvl"
Apr 16 18:05:22.307369 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.307333 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-4mpdb"]
Apr 16 18:05:22.310426 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.310410 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.315067 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.315047 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 18:05:22.315194 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.315096 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 18:05:22.315358 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.315345 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 18:05:22.315402 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.315384 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 18:05:22.316454 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.316435 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 18:05:22.316583 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.316474 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7x7gd\""
Apr 16 18:05:22.316583 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.316481 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 18:05:22.408199 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.408160 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f4753fcd-d099-4568-8ab9-ba10a06ebc56-node-exporter-textfile\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.408199 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.408200 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpfss\" (UniqueName: \"kubernetes.io/projected/f4753fcd-d099-4568-8ab9-ba10a06ebc56-kube-api-access-tpfss\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.408400 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.408222 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f4753fcd-d099-4568-8ab9-ba10a06ebc56-node-exporter-tls\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.408400 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.408306 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f4753fcd-d099-4568-8ab9-ba10a06ebc56-node-exporter-accelerators-collector-config\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.408468 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.408417 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4753fcd-d099-4568-8ab9-ba10a06ebc56-sys\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.408468 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.408449 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f4753fcd-d099-4568-8ab9-ba10a06ebc56-root\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.408529 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.408466 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f4753fcd-d099-4568-8ab9-ba10a06ebc56-node-exporter-wtmp\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.408529 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.408497 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4753fcd-d099-4568-8ab9-ba10a06ebc56-metrics-client-ca\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.408529 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.408520 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4753fcd-d099-4568-8ab9-ba10a06ebc56-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.509336 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.509297 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4753fcd-d099-4568-8ab9-ba10a06ebc56-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.509494 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.509354 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f4753fcd-d099-4568-8ab9-ba10a06ebc56-node-exporter-textfile\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.509494 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.509377 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpfss\" (UniqueName: \"kubernetes.io/projected/f4753fcd-d099-4568-8ab9-ba10a06ebc56-kube-api-access-tpfss\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.509494 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.509395 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f4753fcd-d099-4568-8ab9-ba10a06ebc56-node-exporter-tls\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.509494 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.509414 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f4753fcd-d099-4568-8ab9-ba10a06ebc56-node-exporter-accelerators-collector-config\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.509729 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.509585 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4753fcd-d099-4568-8ab9-ba10a06ebc56-sys\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.509729 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.509663 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f4753fcd-d099-4568-8ab9-ba10a06ebc56-root\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.509729 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.509689 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4753fcd-d099-4568-8ab9-ba10a06ebc56-sys\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.509729 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.509690 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f4753fcd-d099-4568-8ab9-ba10a06ebc56-node-exporter-wtmp\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.509905 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.509753 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4753fcd-d099-4568-8ab9-ba10a06ebc56-metrics-client-ca\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.509905 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.509835 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f4753fcd-d099-4568-8ab9-ba10a06ebc56-node-exporter-wtmp\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.509905 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.509847 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f4753fcd-d099-4568-8ab9-ba10a06ebc56-node-exporter-textfile\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.509905 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.509848 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f4753fcd-d099-4568-8ab9-ba10a06ebc56-root\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.510180 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.510164 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4753fcd-d099-4568-8ab9-ba10a06ebc56-metrics-client-ca\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.510337 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.510317 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f4753fcd-d099-4568-8ab9-ba10a06ebc56-node-exporter-accelerators-collector-config\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.511826 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.511810 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f4753fcd-d099-4568-8ab9-ba10a06ebc56-node-exporter-tls\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.511893 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.511858 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4753fcd-d099-4568-8ab9-ba10a06ebc56-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.524769 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.524742 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpfss\" (UniqueName: \"kubernetes.io/projected/f4753fcd-d099-4568-8ab9-ba10a06ebc56-kube-api-access-tpfss\") pod \"node-exporter-4mpdb\" (UID: \"f4753fcd-d099-4568-8ab9-ba10a06ebc56\") " pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.619155 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.619069 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4mpdb"
Apr 16 18:05:22.627617 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:05:22.627551 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4753fcd_d099_4568_8ab9_ba10a06ebc56.slice/crio-70fb9e73381ad1ed7386b13c5616cb3556682084af20266cca3f6b7dfb6ad7d6 WatchSource:0}: Error finding container 70fb9e73381ad1ed7386b13c5616cb3556682084af20266cca3f6b7dfb6ad7d6: Status 404 returned error can't find the container with id 70fb9e73381ad1ed7386b13c5616cb3556682084af20266cca3f6b7dfb6ad7d6
Apr 16 18:05:22.776955 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:22.776915 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4mpdb" event={"ID":"f4753fcd-d099-4568-8ab9-ba10a06ebc56","Type":"ContainerStarted","Data":"70fb9e73381ad1ed7386b13c5616cb3556682084af20266cca3f6b7dfb6ad7d6"}
Apr 16 18:05:23.780404 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:23.780371 2578 generic.go:358] "Generic (PLEG): container finished" podID="f4753fcd-d099-4568-8ab9-ba10a06ebc56" containerID="36daee1d6acfe744c749be73b831b123e19be2b87e20bfe8fd678cd8bb1ea4dc" exitCode=0
Apr 16 18:05:23.780764 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:23.780420 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4mpdb" event={"ID":"f4753fcd-d099-4568-8ab9-ba10a06ebc56","Type":"ContainerDied","Data":"36daee1d6acfe744c749be73b831b123e19be2b87e20bfe8fd678cd8bb1ea4dc"}
Apr 16 18:05:24.788700 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:24.788667 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4mpdb" event={"ID":"f4753fcd-d099-4568-8ab9-ba10a06ebc56","Type":"ContainerStarted","Data":"fc726350d699606500add353a1f4241813828d15143b4fc11457d46b37d9f46f"}
Apr 16 18:05:24.788700 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:24.788702 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4mpdb" event={"ID":"f4753fcd-d099-4568-8ab9-ba10a06ebc56","Type":"ContainerStarted","Data":"e790005b64ac76de2db92251b1591b0d01570684cdadb188fe712bccfc4b8dc0"}
Apr 16 18:05:24.810654 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:24.810584 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4mpdb" podStartSLOduration=2.078810426 podStartE2EDuration="2.810568821s" podCreationTimestamp="2026-04-16 18:05:22 +0000 UTC" firstStartedPulling="2026-04-16 18:05:22.629992552 +0000 UTC m=+192.933725939" lastFinishedPulling="2026-04-16 18:05:23.361750957 +0000 UTC m=+193.665484334" observedRunningTime="2026-04-16 18:05:24.809004738 +0000 UTC m=+195.112738144" watchObservedRunningTime="2026-04-16 18:05:24.810568821 +0000 UTC m=+195.114302210"
Apr 16 18:05:26.807812 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:26.807770 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-684947fd6b-tfgf9"]
Apr 16 18:05:26.810987 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:26.810965 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:26.813993 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:26.813974 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 16 18:05:26.815776 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:26.815757 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 16 18:05:26.815776 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:26.815771 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 18:05:26.815920 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:26.815771 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-1lsciilfmmo77\""
Apr 16 18:05:26.815920 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:26.815765 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 16 18:05:26.821704 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:26.821684 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-94xrl\""
Apr 16 18:05:26.825183 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:26.825162 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-684947fd6b-tfgf9"]
Apr 16 18:05:26.947781 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:26.947745 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d351123e-13f9-4b65-b989-8d01ad589016-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-684947fd6b-tfgf9\" (UID: \"d351123e-13f9-4b65-b989-8d01ad589016\") " pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:26.947980 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:26.947795 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d351123e-13f9-4b65-b989-8d01ad589016-client-ca-bundle\") pod \"metrics-server-684947fd6b-tfgf9\" (UID: \"d351123e-13f9-4b65-b989-8d01ad589016\") " pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:26.947980 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:26.947845 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d351123e-13f9-4b65-b989-8d01ad589016-secret-metrics-server-tls\") pod \"metrics-server-684947fd6b-tfgf9\" (UID: \"d351123e-13f9-4b65-b989-8d01ad589016\") " pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:26.947980 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:26.947930 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/d351123e-13f9-4b65-b989-8d01ad589016-secret-metrics-server-client-certs\") pod \"metrics-server-684947fd6b-tfgf9\" (UID: \"d351123e-13f9-4b65-b989-8d01ad589016\") " pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:26.947980 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:26.947972 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z86j2\" (UniqueName: \"kubernetes.io/projected/d351123e-13f9-4b65-b989-8d01ad589016-kube-api-access-z86j2\") pod \"metrics-server-684947fd6b-tfgf9\" (UID: \"d351123e-13f9-4b65-b989-8d01ad589016\") " pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:26.948150 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:26.948022 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d351123e-13f9-4b65-b989-8d01ad589016-audit-log\") pod \"metrics-server-684947fd6b-tfgf9\" (UID: \"d351123e-13f9-4b65-b989-8d01ad589016\") " pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:26.948150 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:26.948047 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d351123e-13f9-4b65-b989-8d01ad589016-metrics-server-audit-profiles\") pod \"metrics-server-684947fd6b-tfgf9\" (UID: \"d351123e-13f9-4b65-b989-8d01ad589016\") " pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:27.048557 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.048520 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d351123e-13f9-4b65-b989-8d01ad589016-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-684947fd6b-tfgf9\" (UID: \"d351123e-13f9-4b65-b989-8d01ad589016\") " pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:27.048731 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.048564 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d351123e-13f9-4b65-b989-8d01ad589016-client-ca-bundle\") pod \"metrics-server-684947fd6b-tfgf9\" (UID: \"d351123e-13f9-4b65-b989-8d01ad589016\") " pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:27.048804 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.048725 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d351123e-13f9-4b65-b989-8d01ad589016-secret-metrics-server-tls\") pod \"metrics-server-684947fd6b-tfgf9\" (UID: \"d351123e-13f9-4b65-b989-8d01ad589016\") " pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:27.048804 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.048776 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/d351123e-13f9-4b65-b989-8d01ad589016-secret-metrics-server-client-certs\") pod \"metrics-server-684947fd6b-tfgf9\" (UID: \"d351123e-13f9-4b65-b989-8d01ad589016\") " pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:27.048906 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.048802 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z86j2\" (UniqueName: \"kubernetes.io/projected/d351123e-13f9-4b65-b989-8d01ad589016-kube-api-access-z86j2\") pod \"metrics-server-684947fd6b-tfgf9\" (UID: \"d351123e-13f9-4b65-b989-8d01ad589016\") " pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:27.048906 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.048843 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d351123e-13f9-4b65-b989-8d01ad589016-audit-log\") pod \"metrics-server-684947fd6b-tfgf9\" (UID: \"d351123e-13f9-4b65-b989-8d01ad589016\") " pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:27.048906 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.048862 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d351123e-13f9-4b65-b989-8d01ad589016-metrics-server-audit-profiles\") pod \"metrics-server-684947fd6b-tfgf9\" (UID: \"d351123e-13f9-4b65-b989-8d01ad589016\") " pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:27.049491 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.049314 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d351123e-13f9-4b65-b989-8d01ad589016-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-684947fd6b-tfgf9\" (UID: \"d351123e-13f9-4b65-b989-8d01ad589016\") " pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:27.049491 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.049314 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d351123e-13f9-4b65-b989-8d01ad589016-audit-log\") pod \"metrics-server-684947fd6b-tfgf9\" (UID: \"d351123e-13f9-4b65-b989-8d01ad589016\") " pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:27.049771 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.049752 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d351123e-13f9-4b65-b989-8d01ad589016-metrics-server-audit-profiles\") pod \"metrics-server-684947fd6b-tfgf9\" (UID: \"d351123e-13f9-4b65-b989-8d01ad589016\") " pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:27.051300 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.051275 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d351123e-13f9-4b65-b989-8d01ad589016-client-ca-bundle\") pod \"metrics-server-684947fd6b-tfgf9\" (UID: \"d351123e-13f9-4b65-b989-8d01ad589016\") " pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:27.052177 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.052148 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/d351123e-13f9-4b65-b989-8d01ad589016-secret-metrics-server-client-certs\") pod \"metrics-server-684947fd6b-tfgf9\" (UID: \"d351123e-13f9-4b65-b989-8d01ad589016\") " pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:27.057424 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.057400 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d351123e-13f9-4b65-b989-8d01ad589016-secret-metrics-server-tls\") pod \"metrics-server-684947fd6b-tfgf9\" (UID: \"d351123e-13f9-4b65-b989-8d01ad589016\") " pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:27.057677 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.057661 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z86j2\" (UniqueName: \"kubernetes.io/projected/d351123e-13f9-4b65-b989-8d01ad589016-kube-api-access-z86j2\") pod \"metrics-server-684947fd6b-tfgf9\" (UID: \"d351123e-13f9-4b65-b989-8d01ad589016\") " pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:27.120387 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.120323 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9"
Apr 16 18:05:27.239133 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.239101 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-684947fd6b-tfgf9"]
Apr 16 18:05:27.241711 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:05:27.241683 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd351123e_13f9_4b65_b989_8d01ad589016.slice/crio-c1be40e78db2402ce97ebf1cd13e9b74d3d98cd47ddb80725c24e0410c5e4f15 WatchSource:0}: Error finding container c1be40e78db2402ce97ebf1cd13e9b74d3d98cd47ddb80725c24e0410c5e4f15: Status 404 returned error can't find the container with id c1be40e78db2402ce97ebf1cd13e9b74d3d98cd47ddb80725c24e0410c5e4f15
Apr 16 18:05:27.478398 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.478309 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5d655577fb-fhkvt"]
Apr 16 18:05:27.482730 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.482709 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt"
Apr 16 18:05:27.485794 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.485758 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 16 18:05:27.485911 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.485869 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-vh2x9\""
Apr 16 18:05:27.485911 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.485869 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 16 18:05:27.486288 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.486157 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 16 18:05:27.486806 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.486790 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 16 18:05:27.486862 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.486804 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 16 18:05:27.493760 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.493739 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 16 18:05:27.495851 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.495832 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5d655577fb-fhkvt"]
Apr 16 18:05:27.553322 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.553287 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/68ba8e3f-0e96-499d-bbf5-823e6b695520-secret-telemeter-client\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt"
Apr 16 18:05:27.553460 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.553329 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/68ba8e3f-0e96-499d-bbf5-823e6b695520-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt"
Apr 16 18:05:27.553460 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.553358 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbdx2\" (UniqueName: \"kubernetes.io/projected/68ba8e3f-0e96-499d-bbf5-823e6b695520-kube-api-access-mbdx2\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt"
Apr 16 18:05:27.553460 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.553435 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68ba8e3f-0e96-499d-bbf5-823e6b695520-serving-certs-ca-bundle\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt"
Apr 16 18:05:27.553579 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.553478 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68ba8e3f-0e96-499d-bbf5-823e6b695520-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt"
Apr 16 18:05:27.553579 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.553547 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/68ba8e3f-0e96-499d-bbf5-823e6b695520-telemeter-client-tls\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt"
Apr 16 18:05:27.553579 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.553569 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/68ba8e3f-0e96-499d-bbf5-823e6b695520-federate-client-tls\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt"
Apr 16 18:05:27.553710 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.553622 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/68ba8e3f-0e96-499d-bbf5-823e6b695520-metrics-client-ca\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt"
Apr 16 18:05:27.654296 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.654260 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/68ba8e3f-0e96-499d-bbf5-823e6b695520-telemeter-client-tls\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " 
pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt" Apr 16 18:05:27.654296 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.654297 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/68ba8e3f-0e96-499d-bbf5-823e6b695520-federate-client-tls\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt" Apr 16 18:05:27.654504 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.654323 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/68ba8e3f-0e96-499d-bbf5-823e6b695520-metrics-client-ca\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt" Apr 16 18:05:27.654504 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.654481 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/68ba8e3f-0e96-499d-bbf5-823e6b695520-secret-telemeter-client\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt" Apr 16 18:05:27.654594 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.654523 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/68ba8e3f-0e96-499d-bbf5-823e6b695520-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt" Apr 16 18:05:27.654594 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.654542 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mbdx2\" (UniqueName: \"kubernetes.io/projected/68ba8e3f-0e96-499d-bbf5-823e6b695520-kube-api-access-mbdx2\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt" Apr 16 18:05:27.654727 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.654693 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68ba8e3f-0e96-499d-bbf5-823e6b695520-serving-certs-ca-bundle\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt" Apr 16 18:05:27.654780 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.654756 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68ba8e3f-0e96-499d-bbf5-823e6b695520-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt" Apr 16 18:05:27.655379 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.655355 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68ba8e3f-0e96-499d-bbf5-823e6b695520-serving-certs-ca-bundle\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt" Apr 16 18:05:27.655379 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.655367 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/68ba8e3f-0e96-499d-bbf5-823e6b695520-metrics-client-ca\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: 
\"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt" Apr 16 18:05:27.655988 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.655967 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68ba8e3f-0e96-499d-bbf5-823e6b695520-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt" Apr 16 18:05:27.657052 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.657029 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/68ba8e3f-0e96-499d-bbf5-823e6b695520-telemeter-client-tls\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt" Apr 16 18:05:27.657134 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.657075 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/68ba8e3f-0e96-499d-bbf5-823e6b695520-federate-client-tls\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt" Apr 16 18:05:27.657358 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.657343 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/68ba8e3f-0e96-499d-bbf5-823e6b695520-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt" Apr 16 18:05:27.657429 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.657411 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/68ba8e3f-0e96-499d-bbf5-823e6b695520-secret-telemeter-client\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt" Apr 16 18:05:27.668526 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.668497 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbdx2\" (UniqueName: \"kubernetes.io/projected/68ba8e3f-0e96-499d-bbf5-823e6b695520-kube-api-access-mbdx2\") pod \"telemeter-client-5d655577fb-fhkvt\" (UID: \"68ba8e3f-0e96-499d-bbf5-823e6b695520\") " pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt" Apr 16 18:05:27.792357 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.792331 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt" Apr 16 18:05:27.800875 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.800846 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9" event={"ID":"d351123e-13f9-4b65-b989-8d01ad589016","Type":"ContainerStarted","Data":"c1be40e78db2402ce97ebf1cd13e9b74d3d98cd47ddb80725c24e0410c5e4f15"} Apr 16 18:05:27.915936 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:27.915905 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5d655577fb-fhkvt"] Apr 16 18:05:27.918589 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:05:27.918561 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68ba8e3f_0e96_499d_bbf5_823e6b695520.slice/crio-554276038fce661bc5f0e87506a238d43ffee5bd5708ac81de7e18ed16c15ee0 WatchSource:0}: Error finding container 554276038fce661bc5f0e87506a238d43ffee5bd5708ac81de7e18ed16c15ee0: Status 404 returned 
error can't find the container with id 554276038fce661bc5f0e87506a238d43ffee5bd5708ac81de7e18ed16c15ee0 Apr 16 18:05:28.494221 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.494183 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:05:28.497893 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.497866 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.502978 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.502952 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 18:05:28.503208 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.503191 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 18:05:28.503462 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.503443 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 18:05:28.503994 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.503926 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3v5jig657vmeb\"" Apr 16 18:05:28.503994 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.503938 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 18:05:28.504178 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.504159 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 18:05:28.504481 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.504388 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 
18:05:28.505182 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.505165 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 18:05:28.516484 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.516461 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 18:05:28.517543 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.517343 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-9lnm6\"" Apr 16 18:05:28.517543 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.517401 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 18:05:28.518529 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.518507 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 18:05:28.518754 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.518721 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 18:05:28.519076 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.518860 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 18:05:28.542102 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.542008 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:05:28.561386 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.561286 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/15117bda-c3e8-4779-b823-535e488d3664-config-out\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.561386 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.561352 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.561633 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.561435 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-web-config\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.561633 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.561473 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.561633 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.561511 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-config\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.561633 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.561546 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.561633 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.561572 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz5dn\" (UniqueName: \"kubernetes.io/projected/15117bda-c3e8-4779-b823-535e488d3664-kube-api-access-sz5dn\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.561891 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.561633 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.561891 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.561675 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.561891 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.561744 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-configmap-kubelet-serving-ca-bundle\") 
pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.561891 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.561787 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.561891 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.561814 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.561891 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.561844 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.561891 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.561869 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.562167 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.561905 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/15117bda-c3e8-4779-b823-535e488d3664-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.562167 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.561935 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/15117bda-c3e8-4779-b823-535e488d3664-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.562167 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.561957 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.562167 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.562006 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.663073 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.663036 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.663241 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.663086 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-config\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.663241 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.663132 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.663241 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.663164 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sz5dn\" (UniqueName: \"kubernetes.io/projected/15117bda-c3e8-4779-b823-535e488d3664-kube-api-access-sz5dn\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.663241 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.663213 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.663455 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.663247 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-thanos-prometheus-http-client-file\") 
pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.663455 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.663306 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.663455 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.663330 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.663455 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.663346 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.663455 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.663364 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.663455 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.663381 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.663455 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.663398 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/15117bda-c3e8-4779-b823-535e488d3664-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.663455 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.663414 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/15117bda-c3e8-4779-b823-535e488d3664-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.663455 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.663429 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.663455 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.663454 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.663929 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.663526 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/15117bda-c3e8-4779-b823-535e488d3664-config-out\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.663929 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.663588 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.663929 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.663637 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-web-config\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.664359 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.664258 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.665132 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.665100 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.665379 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.665319 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.669621 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.669109 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-config\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.669621 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.669552 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.669621 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.669583 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/15117bda-c3e8-4779-b823-535e488d3664-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.669823 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.669626 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.671306 ip-10-0-134-55 kubenswrapper[2578]: I0416 
18:05:28.671274 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/15117bda-c3e8-4779-b823-535e488d3664-config-out\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.671478 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.671435 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.671727 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.671686 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.671727 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.671702 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/15117bda-c3e8-4779-b823-535e488d3664-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.672197 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.672175 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.673009 ip-10-0-134-55 kubenswrapper[2578]: I0416 
18:05:28.672986 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.673107 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.673019 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-web-config\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.673513 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.673468 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.674550 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.674526 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.675382 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.675328 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.676224 ip-10-0-134-55 
kubenswrapper[2578]: I0416 18:05:28.676180 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz5dn\" (UniqueName: \"kubernetes.io/projected/15117bda-c3e8-4779-b823-535e488d3664-kube-api-access-sz5dn\") pod \"prometheus-k8s-0\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:28.805573 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.805485 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt" event={"ID":"68ba8e3f-0e96-499d-bbf5-823e6b695520","Type":"ContainerStarted","Data":"554276038fce661bc5f0e87506a238d43ffee5bd5708ac81de7e18ed16c15ee0"} Apr 16 18:05:28.809762 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:28.809733 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:29.111588 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:29.111545 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:05:29.448556 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:05:29.448481 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15117bda_c3e8_4779_b823_535e488d3664.slice/crio-715cf73399dc09fbd5b79222fceb64e56f95dc9a17e3b3accc94c4c394e66139 WatchSource:0}: Error finding container 715cf73399dc09fbd5b79222fceb64e56f95dc9a17e3b3accc94c4c394e66139: Status 404 returned error can't find the container with id 715cf73399dc09fbd5b79222fceb64e56f95dc9a17e3b3accc94c4c394e66139 Apr 16 18:05:29.810359 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:29.809854 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9" event={"ID":"d351123e-13f9-4b65-b989-8d01ad589016","Type":"ContainerStarted","Data":"15e656ead9154d5056eb72b2192a5bffe6226bfa4df99e17c0b75bbc43ec75a5"} 
Apr 16 18:05:29.811481 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:29.811450 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt" event={"ID":"68ba8e3f-0e96-499d-bbf5-823e6b695520","Type":"ContainerStarted","Data":"833d3fc825d1c84c44ba49b55ae6f1e78a7fc5d9cf3e1a9f48dea1b54084e72c"} Apr 16 18:05:29.812770 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:29.812743 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"15117bda-c3e8-4779-b823-535e488d3664","Type":"ContainerStarted","Data":"715cf73399dc09fbd5b79222fceb64e56f95dc9a17e3b3accc94c4c394e66139"} Apr 16 18:05:29.829403 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:29.829346 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9" podStartSLOduration=1.618892854 podStartE2EDuration="3.82932621s" podCreationTimestamp="2026-04-16 18:05:26 +0000 UTC" firstStartedPulling="2026-04-16 18:05:27.243701253 +0000 UTC m=+197.547434629" lastFinishedPulling="2026-04-16 18:05:29.454134609 +0000 UTC m=+199.757867985" observedRunningTime="2026-04-16 18:05:29.827649266 +0000 UTC m=+200.131382661" watchObservedRunningTime="2026-04-16 18:05:29.82932621 +0000 UTC m=+200.133059607" Apr 16 18:05:30.817755 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:30.817715 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt" event={"ID":"68ba8e3f-0e96-499d-bbf5-823e6b695520","Type":"ContainerStarted","Data":"8ab55f0b5c75aba6e1218bea20913f44f4de86ce1ff49009b1ccb2187800f288"} Apr 16 18:05:30.817755 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:30.817759 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt" 
event={"ID":"68ba8e3f-0e96-499d-bbf5-823e6b695520","Type":"ContainerStarted","Data":"c93f13649b645b976d83ec6f4e7f1ae385d4a78638b145c1f2020fde0ba81c6f"} Apr 16 18:05:30.819068 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:30.819045 2578 generic.go:358] "Generic (PLEG): container finished" podID="15117bda-c3e8-4779-b823-535e488d3664" containerID="ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f" exitCode=0 Apr 16 18:05:30.819194 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:30.819143 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"15117bda-c3e8-4779-b823-535e488d3664","Type":"ContainerDied","Data":"ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f"} Apr 16 18:05:30.844061 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:30.844004 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5d655577fb-fhkvt" podStartSLOduration=1.39799589 podStartE2EDuration="3.843987811s" podCreationTimestamp="2026-04-16 18:05:27 +0000 UTC" firstStartedPulling="2026-04-16 18:05:27.920429792 +0000 UTC m=+198.224163168" lastFinishedPulling="2026-04-16 18:05:30.366421713 +0000 UTC m=+200.670155089" observedRunningTime="2026-04-16 18:05:30.842717936 +0000 UTC m=+201.146451325" watchObservedRunningTime="2026-04-16 18:05:30.843987811 +0000 UTC m=+201.147721277" Apr 16 18:05:33.829822 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:33.829737 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"15117bda-c3e8-4779-b823-535e488d3664","Type":"ContainerStarted","Data":"47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742"} Apr 16 18:05:33.829822 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:33.829779 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"15117bda-c3e8-4779-b823-535e488d3664","Type":"ContainerStarted","Data":"f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8"} Apr 16 18:05:35.370141 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.370090 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" podUID="ad99f7e5-4161-446e-ad87-82c8c5677f5f" containerName="registry" containerID="cri-o://0d1e078ea2da862b81d45f20e943183f03a2c2654a795b4a7d008de3fc21bd5f" gracePeriod=30 Apr 16 18:05:35.643437 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.643337 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:05:35.731914 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.731895 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-bound-sa-token\") pod \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " Apr 16 18:05:35.731994 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.731945 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad99f7e5-4161-446e-ad87-82c8c5677f5f-installation-pull-secrets\") pod \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " Apr 16 18:05:35.731994 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.731964 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad99f7e5-4161-446e-ad87-82c8c5677f5f-trusted-ca\") pod \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " Apr 16 18:05:35.732103 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.731992 2578 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xflm7\" (UniqueName: \"kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-kube-api-access-xflm7\") pod \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " Apr 16 18:05:35.732196 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.732170 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-tls\") pod \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " Apr 16 18:05:35.732273 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.732256 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ad99f7e5-4161-446e-ad87-82c8c5677f5f-image-registry-private-configuration\") pod \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " Apr 16 18:05:35.732332 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.732289 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad99f7e5-4161-446e-ad87-82c8c5677f5f-ca-trust-extracted\") pod \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " Apr 16 18:05:35.732386 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.732338 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-certificates\") pod \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\" (UID: \"ad99f7e5-4161-446e-ad87-82c8c5677f5f\") " Apr 16 18:05:35.732436 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.732404 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/ad99f7e5-4161-446e-ad87-82c8c5677f5f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ad99f7e5-4161-446e-ad87-82c8c5677f5f" (UID: "ad99f7e5-4161-446e-ad87-82c8c5677f5f"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:05:35.732688 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.732668 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad99f7e5-4161-446e-ad87-82c8c5677f5f-trusted-ca\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:05:35.733437 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.733412 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "ad99f7e5-4161-446e-ad87-82c8c5677f5f" (UID: "ad99f7e5-4161-446e-ad87-82c8c5677f5f"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:05:35.735088 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.735059 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-kube-api-access-xflm7" (OuterVolumeSpecName: "kube-api-access-xflm7") pod "ad99f7e5-4161-446e-ad87-82c8c5677f5f" (UID: "ad99f7e5-4161-446e-ad87-82c8c5677f5f"). InnerVolumeSpecName "kube-api-access-xflm7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:05:35.735191 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.735178 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad99f7e5-4161-446e-ad87-82c8c5677f5f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ad99f7e5-4161-446e-ad87-82c8c5677f5f" (UID: "ad99f7e5-4161-446e-ad87-82c8c5677f5f"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:05:35.735247 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.735183 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ad99f7e5-4161-446e-ad87-82c8c5677f5f" (UID: "ad99f7e5-4161-446e-ad87-82c8c5677f5f"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:05:35.735369 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.735338 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad99f7e5-4161-446e-ad87-82c8c5677f5f-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "ad99f7e5-4161-446e-ad87-82c8c5677f5f" (UID: "ad99f7e5-4161-446e-ad87-82c8c5677f5f"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:05:35.735721 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.735700 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "ad99f7e5-4161-446e-ad87-82c8c5677f5f" (UID: "ad99f7e5-4161-446e-ad87-82c8c5677f5f"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:05:35.742188 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.742151 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad99f7e5-4161-446e-ad87-82c8c5677f5f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ad99f7e5-4161-446e-ad87-82c8c5677f5f" (UID: "ad99f7e5-4161-446e-ad87-82c8c5677f5f"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:05:35.833085 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.833057 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ad99f7e5-4161-446e-ad87-82c8c5677f5f-image-registry-private-configuration\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:05:35.833085 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.833084 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad99f7e5-4161-446e-ad87-82c8c5677f5f-ca-trust-extracted\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:05:35.833315 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.833096 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-certificates\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:05:35.833315 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.833105 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-bound-sa-token\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:05:35.833315 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.833115 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad99f7e5-4161-446e-ad87-82c8c5677f5f-installation-pull-secrets\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:05:35.833315 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.833125 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xflm7\" (UniqueName: \"kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-kube-api-access-xflm7\") on node \"ip-10-0-134-55.ec2.internal\" 
DevicePath \"\"" Apr 16 18:05:35.833315 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.833135 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad99f7e5-4161-446e-ad87-82c8c5677f5f-registry-tls\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:05:35.835940 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.835912 2578 generic.go:358] "Generic (PLEG): container finished" podID="ad99f7e5-4161-446e-ad87-82c8c5677f5f" containerID="0d1e078ea2da862b81d45f20e943183f03a2c2654a795b4a7d008de3fc21bd5f" exitCode=0 Apr 16 18:05:35.836056 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.835974 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" Apr 16 18:05:35.836056 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.836006 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" event={"ID":"ad99f7e5-4161-446e-ad87-82c8c5677f5f","Type":"ContainerDied","Data":"0d1e078ea2da862b81d45f20e943183f03a2c2654a795b4a7d008de3fc21bd5f"} Apr 16 18:05:35.836056 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.836047 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9b8648dc-vjsvl" event={"ID":"ad99f7e5-4161-446e-ad87-82c8c5677f5f","Type":"ContainerDied","Data":"94df097c11fbfb3a72fd672c6ff6f62e15a84529cc4893823186ba2da5c38979"} Apr 16 18:05:35.836218 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.836067 2578 scope.go:117] "RemoveContainer" containerID="0d1e078ea2da862b81d45f20e943183f03a2c2654a795b4a7d008de3fc21bd5f" Apr 16 18:05:35.839252 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.839229 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"15117bda-c3e8-4779-b823-535e488d3664","Type":"ContainerStarted","Data":"a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b"} Apr 16 18:05:35.839339 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.839255 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"15117bda-c3e8-4779-b823-535e488d3664","Type":"ContainerStarted","Data":"3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628"} Apr 16 18:05:35.839339 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.839265 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"15117bda-c3e8-4779-b823-535e488d3664","Type":"ContainerStarted","Data":"8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4"} Apr 16 18:05:35.839339 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.839278 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"15117bda-c3e8-4779-b823-535e488d3664","Type":"ContainerStarted","Data":"4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add"} Apr 16 18:05:35.844820 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.844782 2578 scope.go:117] "RemoveContainer" containerID="0d1e078ea2da862b81d45f20e943183f03a2c2654a795b4a7d008de3fc21bd5f" Apr 16 18:05:35.845031 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:05:35.845012 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d1e078ea2da862b81d45f20e943183f03a2c2654a795b4a7d008de3fc21bd5f\": container with ID starting with 0d1e078ea2da862b81d45f20e943183f03a2c2654a795b4a7d008de3fc21bd5f not found: ID does not exist" containerID="0d1e078ea2da862b81d45f20e943183f03a2c2654a795b4a7d008de3fc21bd5f" Apr 16 18:05:35.845093 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.845039 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0d1e078ea2da862b81d45f20e943183f03a2c2654a795b4a7d008de3fc21bd5f"} err="failed to get container status \"0d1e078ea2da862b81d45f20e943183f03a2c2654a795b4a7d008de3fc21bd5f\": rpc error: code = NotFound desc = could not find container \"0d1e078ea2da862b81d45f20e943183f03a2c2654a795b4a7d008de3fc21bd5f\": container with ID starting with 0d1e078ea2da862b81d45f20e943183f03a2c2654a795b4a7d008de3fc21bd5f not found: ID does not exist" Apr 16 18:05:35.870528 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.870490 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.854318106 podStartE2EDuration="7.870479634s" podCreationTimestamp="2026-04-16 18:05:28 +0000 UTC" firstStartedPulling="2026-04-16 18:05:29.450563923 +0000 UTC m=+199.754297297" lastFinishedPulling="2026-04-16 18:05:35.466725448 +0000 UTC m=+205.770458825" observedRunningTime="2026-04-16 18:05:35.869100487 +0000 UTC m=+206.172834049" watchObservedRunningTime="2026-04-16 18:05:35.870479634 +0000 UTC m=+206.174213027" Apr 16 18:05:35.886112 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.886061 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-9b8648dc-vjsvl"] Apr 16 18:05:35.891188 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:35.891167 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-9b8648dc-vjsvl"] Apr 16 18:05:36.254582 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:36.254547 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad99f7e5-4161-446e-ad87-82c8c5677f5f" path="/var/lib/kubelet/pods/ad99f7e5-4161-446e-ad87-82c8c5677f5f/volumes" Apr 16 18:05:38.810760 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:38.810706 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:42.835497 
ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:42.835434 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s" podUID="ddae9640-751b-43e0-afe8-180c52b1a0ce" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 18:05:47.120723 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:47.120662 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9" Apr 16 18:05:47.120723 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:47.120726 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9" Apr 16 18:05:52.835981 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:05:52.835936 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s" podUID="ddae9640-751b-43e0-afe8-180c52b1a0ce" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 18:06:02.836089 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:02.836049 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s" podUID="ddae9640-751b-43e0-afe8-180c52b1a0ce" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 18:06:02.836443 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:02.836128 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s" Apr 16 18:06:02.836584 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:02.836566 2578 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" 
containerStatusID={"Type":"cri-o","ID":"364352ef001efe1da543418b53ee4ebd0053516d7a73ecd28b2c56accfbdfcc5"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 18:06:02.836649 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:02.836618 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s" podUID="ddae9640-751b-43e0-afe8-180c52b1a0ce" containerName="service-proxy" containerID="cri-o://364352ef001efe1da543418b53ee4ebd0053516d7a73ecd28b2c56accfbdfcc5" gracePeriod=30 Apr 16 18:06:03.914056 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:03.914025 2578 generic.go:358] "Generic (PLEG): container finished" podID="ddae9640-751b-43e0-afe8-180c52b1a0ce" containerID="364352ef001efe1da543418b53ee4ebd0053516d7a73ecd28b2c56accfbdfcc5" exitCode=2 Apr 16 18:06:03.914431 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:03.914092 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s" event={"ID":"ddae9640-751b-43e0-afe8-180c52b1a0ce","Type":"ContainerDied","Data":"364352ef001efe1da543418b53ee4ebd0053516d7a73ecd28b2c56accfbdfcc5"} Apr 16 18:06:03.914431 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:03.914130 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f859c47df-tsh8s" event={"ID":"ddae9640-751b-43e0-afe8-180c52b1a0ce","Type":"ContainerStarted","Data":"075c637a3e117ba9f64b05a0a13c947668c6665993f6a83f0acfca5ce1ab31be"} Apr 16 18:06:07.125906 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:07.125878 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9" Apr 16 18:06:07.129693 ip-10-0-134-55 kubenswrapper[2578]: I0416 
18:06:07.129670 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-684947fd6b-tfgf9" Apr 16 18:06:22.011626 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:22.011576 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs\") pod \"network-metrics-daemon-vnntp\" (UID: \"e1d07798-4149-483d-a793-b43a2b14fdbf\") " pod="openshift-multus/network-metrics-daemon-vnntp" Apr 16 18:06:22.014022 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:22.014001 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1d07798-4149-483d-a793-b43a2b14fdbf-metrics-certs\") pod \"network-metrics-daemon-vnntp\" (UID: \"e1d07798-4149-483d-a793-b43a2b14fdbf\") " pod="openshift-multus/network-metrics-daemon-vnntp" Apr 16 18:06:22.254415 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:22.254387 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jt7dn\"" Apr 16 18:06:22.261896 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:22.261835 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vnntp" Apr 16 18:06:22.378832 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:22.378697 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vnntp"] Apr 16 18:06:22.381279 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:06:22.381250 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1d07798_4149_483d_a793_b43a2b14fdbf.slice/crio-46d4780d8306de12efa393aff1b375efd43b8776515938bbfd527e92ee504f9f WatchSource:0}: Error finding container 46d4780d8306de12efa393aff1b375efd43b8776515938bbfd527e92ee504f9f: Status 404 returned error can't find the container with id 46d4780d8306de12efa393aff1b375efd43b8776515938bbfd527e92ee504f9f Apr 16 18:06:22.961973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:22.961938 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vnntp" event={"ID":"e1d07798-4149-483d-a793-b43a2b14fdbf","Type":"ContainerStarted","Data":"46d4780d8306de12efa393aff1b375efd43b8776515938bbfd527e92ee504f9f"} Apr 16 18:06:23.966839 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:23.966802 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vnntp" event={"ID":"e1d07798-4149-483d-a793-b43a2b14fdbf","Type":"ContainerStarted","Data":"6261e88537d3e6b14e7f1d2e2fee7cd88d71b9f828eb37a5196202e935c9c976"} Apr 16 18:06:23.966839 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:23.966843 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vnntp" event={"ID":"e1d07798-4149-483d-a793-b43a2b14fdbf","Type":"ContainerStarted","Data":"b0cd5e5f755e8f7b326e2ff028da4f69d594ae954274c127998ea9ba4d63bd0e"} Apr 16 18:06:23.985411 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:23.985251 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-vnntp" podStartSLOduration=253.044969523 podStartE2EDuration="4m13.985233062s" podCreationTimestamp="2026-04-16 18:02:10 +0000 UTC" firstStartedPulling="2026-04-16 18:06:22.383258192 +0000 UTC m=+252.686991570" lastFinishedPulling="2026-04-16 18:06:23.323521732 +0000 UTC m=+253.627255109" observedRunningTime="2026-04-16 18:06:23.984021828 +0000 UTC m=+254.287755223" watchObservedRunningTime="2026-04-16 18:06:23.985233062 +0000 UTC m=+254.288966460" Apr 16 18:06:28.810498 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:28.810408 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:28.830120 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:28.830094 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:28.995349 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:28.995321 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:46.995306 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:46.995272 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:06:46.995835 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:46.995714 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="prometheus" containerID="cri-o://f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8" gracePeriod=600 Apr 16 18:06:46.995835 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:46.995732 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="thanos-sidecar" 
containerID="cri-o://4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add" gracePeriod=600 Apr 16 18:06:46.995835 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:46.995732 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="kube-rbac-proxy" containerID="cri-o://3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628" gracePeriod=600 Apr 16 18:06:46.995835 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:46.995799 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="kube-rbac-proxy-web" containerID="cri-o://8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4" gracePeriod=600 Apr 16 18:06:46.996047 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:46.995812 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="kube-rbac-proxy-thanos" containerID="cri-o://a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b" gracePeriod=600 Apr 16 18:06:46.996047 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:46.995794 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="config-reloader" containerID="cri-o://47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742" gracePeriod=600 Apr 16 18:06:47.317769 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.317744 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:47.408683 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.408651 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-configmap-serving-certs-ca-bundle\") pod \"15117bda-c3e8-4779-b823-535e488d3664\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " Apr 16 18:06:47.408869 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.408841 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"15117bda-c3e8-4779-b823-535e488d3664\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " Apr 16 18:06:47.408938 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.408885 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-thanos-prometheus-http-client-file\") pod \"15117bda-c3e8-4779-b823-535e488d3664\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " Apr 16 18:06:47.408938 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.408923 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-prometheus-k8s-tls\") pod \"15117bda-c3e8-4779-b823-535e488d3664\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " Apr 16 18:06:47.409038 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.408957 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"15117bda-c3e8-4779-b823-535e488d3664\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " Apr 16 18:06:47.409038 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.408980 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "15117bda-c3e8-4779-b823-535e488d3664" (UID: "15117bda-c3e8-4779-b823-535e488d3664"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:06:47.409038 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.408989 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-prometheus-trusted-ca-bundle\") pod \"15117bda-c3e8-4779-b823-535e488d3664\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " Apr 16 18:06:47.409189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.409043 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-config\") pod \"15117bda-c3e8-4779-b823-535e488d3664\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " Apr 16 18:06:47.409189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.409072 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-configmap-metrics-client-ca\") pod \"15117bda-c3e8-4779-b823-535e488d3664\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " Apr 16 18:06:47.409189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.409103 2578 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/15117bda-c3e8-4779-b823-535e488d3664-config-out\") pod \"15117bda-c3e8-4779-b823-535e488d3664\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " Apr 16 18:06:47.409189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.409133 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-kube-rbac-proxy\") pod \"15117bda-c3e8-4779-b823-535e488d3664\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " Apr 16 18:06:47.409189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.409160 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-prometheus-k8s-rulefiles-0\") pod \"15117bda-c3e8-4779-b823-535e488d3664\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " Apr 16 18:06:47.409431 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.409332 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-web-config\") pod \"15117bda-c3e8-4779-b823-535e488d3664\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " Apr 16 18:06:47.409431 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.409357 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/15117bda-c3e8-4779-b823-535e488d3664-tls-assets\") pod \"15117bda-c3e8-4779-b823-535e488d3664\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " Apr 16 18:06:47.409431 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.409375 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "15117bda-c3e8-4779-b823-535e488d3664" (UID: "15117bda-c3e8-4779-b823-535e488d3664"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:06:47.409431 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.409398 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-metrics-client-certs\") pod \"15117bda-c3e8-4779-b823-535e488d3664\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " Apr 16 18:06:47.409648 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.409432 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz5dn\" (UniqueName: \"kubernetes.io/projected/15117bda-c3e8-4779-b823-535e488d3664-kube-api-access-sz5dn\") pod \"15117bda-c3e8-4779-b823-535e488d3664\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " Apr 16 18:06:47.409648 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.409460 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-configmap-kubelet-serving-ca-bundle\") pod \"15117bda-c3e8-4779-b823-535e488d3664\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " Apr 16 18:06:47.409648 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.409524 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-grpc-tls\") pod \"15117bda-c3e8-4779-b823-535e488d3664\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " Apr 16 18:06:47.409648 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.409565 2578 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/15117bda-c3e8-4779-b823-535e488d3664-prometheus-k8s-db\") pod \"15117bda-c3e8-4779-b823-535e488d3664\" (UID: \"15117bda-c3e8-4779-b823-535e488d3664\") " Apr 16 18:06:47.409852 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.409786 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "15117bda-c3e8-4779-b823-535e488d3664" (UID: "15117bda-c3e8-4779-b823-535e488d3664"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:06:47.412280 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.409981 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:06:47.412280 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.410009 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-prometheus-trusted-ca-bundle\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:06:47.412280 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.410027 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-configmap-metrics-client-ca\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:06:47.412280 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.410670 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "15117bda-c3e8-4779-b823-535e488d3664" (UID: "15117bda-c3e8-4779-b823-535e488d3664"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:06:47.412280 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.411647 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "15117bda-c3e8-4779-b823-535e488d3664" (UID: "15117bda-c3e8-4779-b823-535e488d3664"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:06:47.412280 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.411918 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "15117bda-c3e8-4779-b823-535e488d3664" (UID: "15117bda-c3e8-4779-b823-535e488d3664"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:47.412280 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.411970 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "15117bda-c3e8-4779-b823-535e488d3664" (UID: "15117bda-c3e8-4779-b823-535e488d3664"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:47.412280 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.412198 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "15117bda-c3e8-4779-b823-535e488d3664" (UID: "15117bda-c3e8-4779-b823-535e488d3664"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:47.412857 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.412826 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "15117bda-c3e8-4779-b823-535e488d3664" (UID: "15117bda-c3e8-4779-b823-535e488d3664"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:47.412988 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.412888 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "15117bda-c3e8-4779-b823-535e488d3664" (UID: "15117bda-c3e8-4779-b823-535e488d3664"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:47.413049 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.412994 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15117bda-c3e8-4779-b823-535e488d3664-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "15117bda-c3e8-4779-b823-535e488d3664" (UID: "15117bda-c3e8-4779-b823-535e488d3664"). 
InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:06:47.413552 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.413528 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "15117bda-c3e8-4779-b823-535e488d3664" (UID: "15117bda-c3e8-4779-b823-535e488d3664"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:47.413664 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.413582 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15117bda-c3e8-4779-b823-535e488d3664-kube-api-access-sz5dn" (OuterVolumeSpecName: "kube-api-access-sz5dn") pod "15117bda-c3e8-4779-b823-535e488d3664" (UID: "15117bda-c3e8-4779-b823-535e488d3664"). InnerVolumeSpecName "kube-api-access-sz5dn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:06:47.413664 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.413618 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15117bda-c3e8-4779-b823-535e488d3664-config-out" (OuterVolumeSpecName: "config-out") pod "15117bda-c3e8-4779-b823-535e488d3664" (UID: "15117bda-c3e8-4779-b823-535e488d3664"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:06:47.414250 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.414234 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "15117bda-c3e8-4779-b823-535e488d3664" (UID: "15117bda-c3e8-4779-b823-535e488d3664"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:47.414330 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.414306 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-config" (OuterVolumeSpecName: "config") pod "15117bda-c3e8-4779-b823-535e488d3664" (UID: "15117bda-c3e8-4779-b823-535e488d3664"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:47.414863 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.414832 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15117bda-c3e8-4779-b823-535e488d3664-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "15117bda-c3e8-4779-b823-535e488d3664" (UID: "15117bda-c3e8-4779-b823-535e488d3664"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:06:47.424069 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.424048 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-web-config" (OuterVolumeSpecName: "web-config") pod "15117bda-c3e8-4779-b823-535e488d3664" (UID: "15117bda-c3e8-4779-b823-535e488d3664"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:47.510533 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.510437 2578 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-web-config\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:06:47.510533 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.510469 2578 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/15117bda-c3e8-4779-b823-535e488d3664-tls-assets\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:06:47.510533 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.510479 2578 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-metrics-client-certs\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:06:47.510533 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.510490 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sz5dn\" (UniqueName: \"kubernetes.io/projected/15117bda-c3e8-4779-b823-535e488d3664-kube-api-access-sz5dn\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:06:47.510533 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.510500 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:06:47.510533 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.510509 2578 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-grpc-tls\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:06:47.510533 
ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.510518 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/15117bda-c3e8-4779-b823-535e488d3664-prometheus-k8s-db\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:06:47.510533 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.510528 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:06:47.510533 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.510537 2578 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-thanos-prometheus-http-client-file\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:06:47.510943 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.510546 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-prometheus-k8s-tls\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:06:47.510943 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.510556 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:06:47.510943 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.510564 2578 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-config\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 
18:06:47.510943 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.510572 2578 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/15117bda-c3e8-4779-b823-535e488d3664-config-out\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:06:47.510943 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.510582 2578 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/15117bda-c3e8-4779-b823-535e488d3664-secret-kube-rbac-proxy\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:06:47.510943 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:47.510590 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/15117bda-c3e8-4779-b823-535e488d3664-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:06:48.029291 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.029264 2578 generic.go:358] "Generic (PLEG): container finished" podID="15117bda-c3e8-4779-b823-535e488d3664" containerID="a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b" exitCode=0 Apr 16 18:06:48.029291 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.029285 2578 generic.go:358] "Generic (PLEG): container finished" podID="15117bda-c3e8-4779-b823-535e488d3664" containerID="3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628" exitCode=0 Apr 16 18:06:48.029291 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.029291 2578 generic.go:358] "Generic (PLEG): container finished" podID="15117bda-c3e8-4779-b823-535e488d3664" containerID="8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4" exitCode=0 Apr 16 18:06:48.029794 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.029297 2578 generic.go:358] "Generic (PLEG): container finished" podID="15117bda-c3e8-4779-b823-535e488d3664" 
containerID="4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add" exitCode=0 Apr 16 18:06:48.029794 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.029302 2578 generic.go:358] "Generic (PLEG): container finished" podID="15117bda-c3e8-4779-b823-535e488d3664" containerID="47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742" exitCode=0 Apr 16 18:06:48.029794 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.029307 2578 generic.go:358] "Generic (PLEG): container finished" podID="15117bda-c3e8-4779-b823-535e488d3664" containerID="f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8" exitCode=0 Apr 16 18:06:48.029794 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.029326 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"15117bda-c3e8-4779-b823-535e488d3664","Type":"ContainerDied","Data":"a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b"} Apr 16 18:06:48.029794 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.029352 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"15117bda-c3e8-4779-b823-535e488d3664","Type":"ContainerDied","Data":"3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628"} Apr 16 18:06:48.029794 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.029362 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"15117bda-c3e8-4779-b823-535e488d3664","Type":"ContainerDied","Data":"8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4"} Apr 16 18:06:48.029794 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.029371 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"15117bda-c3e8-4779-b823-535e488d3664","Type":"ContainerDied","Data":"4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add"} Apr 16 18:06:48.029794 ip-10-0-134-55 
kubenswrapper[2578]: I0416 18:06:48.029381 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"15117bda-c3e8-4779-b823-535e488d3664","Type":"ContainerDied","Data":"47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742"} Apr 16 18:06:48.029794 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.029388 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"15117bda-c3e8-4779-b823-535e488d3664","Type":"ContainerDied","Data":"f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8"} Apr 16 18:06:48.029794 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.029396 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"15117bda-c3e8-4779-b823-535e488d3664","Type":"ContainerDied","Data":"715cf73399dc09fbd5b79222fceb64e56f95dc9a17e3b3accc94c4c394e66139"} Apr 16 18:06:48.029794 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.029409 2578 scope.go:117] "RemoveContainer" containerID="a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b" Apr 16 18:06:48.029794 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.029470 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.036889 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.036794 2578 scope.go:117] "RemoveContainer" containerID="3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628" Apr 16 18:06:48.043596 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.043582 2578 scope.go:117] "RemoveContainer" containerID="8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4" Apr 16 18:06:48.049511 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.049493 2578 scope.go:117] "RemoveContainer" containerID="4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add" Apr 16 18:06:48.054570 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.054546 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:06:48.056159 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.056057 2578 scope.go:117] "RemoveContainer" containerID="47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742" Apr 16 18:06:48.057754 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.057727 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:06:48.064510 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.063742 2578 scope.go:117] "RemoveContainer" containerID="f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8" Apr 16 18:06:48.071176 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.071147 2578 scope.go:117] "RemoveContainer" containerID="ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f" Apr 16 18:06:48.077482 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.077463 2578 scope.go:117] "RemoveContainer" containerID="a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b" Apr 16 18:06:48.077796 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:06:48.077774 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b\": container with ID starting with a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b not found: ID does not exist" containerID="a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b"
Apr 16 18:06:48.077851 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.077808 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b"} err="failed to get container status \"a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b\": rpc error: code = NotFound desc = could not find container \"a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b\": container with ID starting with a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b not found: ID does not exist"
Apr 16 18:06:48.077851 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.077835 2578 scope.go:117] "RemoveContainer" containerID="3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628"
Apr 16 18:06:48.078084 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:06:48.078065 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628\": container with ID starting with 3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628 not found: ID does not exist" containerID="3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628"
Apr 16 18:06:48.078138 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.078091 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628"} err="failed to get container status \"3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628\": rpc error: code = NotFound desc = could not find container \"3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628\": container with ID starting with 3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628 not found: ID does not exist"
Apr 16 18:06:48.078138 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.078107 2578 scope.go:117] "RemoveContainer" containerID="8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4"
Apr 16 18:06:48.078328 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:06:48.078311 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4\": container with ID starting with 8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4 not found: ID does not exist" containerID="8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4"
Apr 16 18:06:48.078391 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.078349 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4"} err="failed to get container status \"8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4\": rpc error: code = NotFound desc = could not find container \"8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4\": container with ID starting with 8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4 not found: ID does not exist"
Apr 16 18:06:48.078391 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.078372 2578 scope.go:117] "RemoveContainer" containerID="4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add"
Apr 16 18:06:48.078634 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:06:48.078617 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add\": container with ID starting with 4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add not found: ID does not exist" containerID="4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add"
Apr 16 18:06:48.078680 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.078638 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add"} err="failed to get container status \"4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add\": rpc error: code = NotFound desc = could not find container \"4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add\": container with ID starting with 4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add not found: ID does not exist"
Apr 16 18:06:48.078680 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.078652 2578 scope.go:117] "RemoveContainer" containerID="47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742"
Apr 16 18:06:48.078847 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:06:48.078832 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742\": container with ID starting with 47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742 not found: ID does not exist" containerID="47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742"
Apr 16 18:06:48.078888 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.078853 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742"} err="failed to get container status \"47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742\": rpc error: code = NotFound desc = could not find container \"47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742\": container with ID starting with 47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742 not found: ID does not exist"
Apr 16 18:06:48.078888 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.078866 2578 scope.go:117] "RemoveContainer" containerID="f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8"
Apr 16 18:06:48.079057 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:06:48.079040 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8\": container with ID starting with f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8 not found: ID does not exist" containerID="f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8"
Apr 16 18:06:48.079121 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.079064 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8"} err="failed to get container status \"f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8\": rpc error: code = NotFound desc = could not find container \"f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8\": container with ID starting with f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8 not found: ID does not exist"
Apr 16 18:06:48.079121 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.079083 2578 scope.go:117] "RemoveContainer" containerID="ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f"
Apr 16 18:06:48.079322 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:06:48.079306 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f\": container with ID starting with ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f not found: ID does not exist" containerID="ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f"
Apr 16 18:06:48.079359 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.079326 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f"} err="failed to get container status \"ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f\": rpc error: code = NotFound desc = could not find container \"ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f\": container with ID starting with ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f not found: ID does not exist"
Apr 16 18:06:48.079359 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.079341 2578 scope.go:117] "RemoveContainer" containerID="a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b"
Apr 16 18:06:48.079551 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.079534 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b"} err="failed to get container status \"a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b\": rpc error: code = NotFound desc = could not find container \"a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b\": container with ID starting with a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b not found: ID does not exist"
Apr 16 18:06:48.079592 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.079551 2578 scope.go:117] "RemoveContainer" containerID="3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628"
Apr 16 18:06:48.079764 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.079746 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628"} err="failed to get container status \"3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628\": rpc error: code = NotFound desc = could not find container \"3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628\": container with ID starting with 3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628 not found: ID does not exist"
Apr 16 18:06:48.079830 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.079766 2578 scope.go:117] "RemoveContainer" containerID="8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4"
Apr 16 18:06:48.079961 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.079942 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4"} err="failed to get container status \"8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4\": rpc error: code = NotFound desc = could not find container \"8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4\": container with ID starting with 8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4 not found: ID does not exist"
Apr 16 18:06:48.080009 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.079962 2578 scope.go:117] "RemoveContainer" containerID="4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add"
Apr 16 18:06:48.080150 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.080135 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add"} err="failed to get container status \"4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add\": rpc error: code = NotFound desc = could not find container \"4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add\": container with ID starting with 4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add not found: ID does not exist"
Apr 16 18:06:48.080201 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.080150 2578 scope.go:117] "RemoveContainer" containerID="47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742"
Apr 16 18:06:48.080324 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.080308 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742"} err="failed to get container status \"47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742\": rpc error: code = NotFound desc = could not find container \"47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742\": container with ID starting with 47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742 not found: ID does not exist"
Apr 16 18:06:48.080367 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.080324 2578 scope.go:117] "RemoveContainer" containerID="f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8"
Apr 16 18:06:48.080498 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.080481 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8"} err="failed to get container status \"f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8\": rpc error: code = NotFound desc = could not find container \"f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8\": container with ID starting with f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8 not found: ID does not exist"
Apr 16 18:06:48.080539 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.080498 2578 scope.go:117] "RemoveContainer" containerID="ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f"
Apr 16 18:06:48.080690 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.080675 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f"} err="failed to get container status \"ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f\": rpc error: code = NotFound desc = could not find container \"ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f\": container with ID starting with ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f not found: ID does not exist"
Apr 16 18:06:48.080739 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.080689 2578 scope.go:117] "RemoveContainer" containerID="a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b"
Apr 16 18:06:48.080884 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.080869 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b"} err="failed to get container status \"a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b\": rpc error: code = NotFound desc = could not find container \"a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b\": container with ID starting with a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b not found: ID does not exist"
Apr 16 18:06:48.080926 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.080883 2578 scope.go:117] "RemoveContainer" containerID="3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628"
Apr 16 18:06:48.081061 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.081044 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628"} err="failed to get container status \"3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628\": rpc error: code = NotFound desc = could not find container \"3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628\": container with ID starting with 3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628 not found: ID does not exist"
Apr 16 18:06:48.081103 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.081062 2578 scope.go:117] "RemoveContainer" containerID="8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4"
Apr 16 18:06:48.081248 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.081227 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4"} err="failed to get container status \"8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4\": rpc error: code = NotFound desc = could not find container \"8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4\": container with ID starting with 8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4 not found: ID does not exist"
Apr 16 18:06:48.081248 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.081247 2578 scope.go:117] "RemoveContainer" containerID="4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add"
Apr 16 18:06:48.081448 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.081431 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add"} err="failed to get container status \"4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add\": rpc error: code = NotFound desc = could not find container \"4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add\": container with ID starting with 4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add not found: ID does not exist"
Apr 16 18:06:48.081448 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.081447 2578 scope.go:117] "RemoveContainer" containerID="47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742"
Apr 16 18:06:48.081670 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.081649 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742"} err="failed to get container status \"47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742\": rpc error: code = NotFound desc = could not find container \"47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742\": container with ID starting with 47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742 not found: ID does not exist"
Apr 16 18:06:48.081738 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.081672 2578 scope.go:117] "RemoveContainer" containerID="f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8"
Apr 16 18:06:48.081898 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.081883 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8"} err="failed to get container status \"f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8\": rpc error: code = NotFound desc = could not find container \"f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8\": container with ID starting with f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8 not found: ID does not exist"
Apr 16 18:06:48.081962 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.081898 2578 scope.go:117] "RemoveContainer" containerID="ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f"
Apr 16 18:06:48.082112 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.082094 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f"} err="failed to get container status \"ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f\": rpc error: code = NotFound desc = could not find container \"ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f\": container with ID starting with ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f not found: ID does not exist"
Apr 16 18:06:48.082163 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.082112 2578 scope.go:117] "RemoveContainer" containerID="a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b"
Apr 16 18:06:48.082288 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.082269 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b"} err="failed to get container status \"a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b\": rpc error: code = NotFound desc = could not find container \"a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b\": container with ID starting with a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b not found: ID does not exist"
Apr 16 18:06:48.082325 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.082288 2578 scope.go:117] "RemoveContainer" containerID="3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628"
Apr 16 18:06:48.082454 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.082439 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628"} err="failed to get container status \"3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628\": rpc error: code = NotFound desc = could not find container \"3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628\": container with ID starting with 3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628 not found: ID does not exist"
Apr 16 18:06:48.082492 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.082454 2578 scope.go:117] "RemoveContainer" containerID="8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4"
Apr 16 18:06:48.082629 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.082595 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4"} err="failed to get container status \"8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4\": rpc error: code = NotFound desc = could not find container \"8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4\": container with ID starting with 8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4 not found: ID does not exist"
Apr 16 18:06:48.082676 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.082631 2578 scope.go:117] "RemoveContainer" containerID="4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add"
Apr 16 18:06:48.082797 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.082782 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add"} err="failed to get container status \"4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add\": rpc error: code = NotFound desc = could not find container \"4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add\": container with ID starting with 4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add not found: ID does not exist"
Apr 16 18:06:48.082832 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.082797 2578 scope.go:117] "RemoveContainer" containerID="47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742"
Apr 16 18:06:48.082946 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.082930 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742"} err="failed to get container status \"47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742\": rpc error: code = NotFound desc = could not find container \"47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742\": container with ID starting with 47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742 not found: ID does not exist"
Apr 16 18:06:48.082946 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.082946 2578 scope.go:117] "RemoveContainer" containerID="f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8"
Apr 16 18:06:48.083149 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.083134 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8"} err="failed to get container status \"f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8\": rpc error: code = NotFound desc = could not find container \"f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8\": container with ID starting with f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8 not found: ID does not exist"
Apr 16 18:06:48.083189 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.083150 2578 scope.go:117] "RemoveContainer" containerID="ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f"
Apr 16 18:06:48.083328 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.083307 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f"} err="failed to get container status \"ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f\": rpc error: code = NotFound desc = could not find container \"ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f\": container with ID starting with ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f not found: ID does not exist"
Apr 16 18:06:48.083387 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.083330 2578 scope.go:117] "RemoveContainer" containerID="a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b"
Apr 16 18:06:48.083531 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.083512 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b"} err="failed to get container status \"a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b\": rpc error: code = NotFound desc = could not find container \"a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b\": container with ID starting with a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b not found: ID does not exist"
Apr 16 18:06:48.083531 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.083529 2578 scope.go:117] "RemoveContainer" containerID="3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628"
Apr 16 18:06:48.083761 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.083738 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628"} err="failed to get container status \"3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628\": rpc error: code = NotFound desc = could not find container \"3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628\": container with ID starting with 3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628 not found: ID does not exist"
Apr 16 18:06:48.083761 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.083759 2578 scope.go:117] "RemoveContainer" containerID="8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4"
Apr 16 18:06:48.084000 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.083980 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4"} err="failed to get container status \"8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4\": rpc error: code = NotFound desc = could not find container \"8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4\": container with ID starting with 8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4 not found: ID does not exist"
Apr 16 18:06:48.084052 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.084001 2578 scope.go:117] "RemoveContainer" containerID="4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add"
Apr 16 18:06:48.084190 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.084174 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add"} err="failed to get container status \"4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add\": rpc error: code = NotFound desc = could not find container \"4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add\": container with ID starting with 4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add not found: ID does not exist"
Apr 16 18:06:48.084235 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.084190 2578 scope.go:117] "RemoveContainer" containerID="47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742"
Apr 16 18:06:48.084357 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.084341 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742"} err="failed to get container status \"47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742\": rpc error: code = NotFound desc = could not find container \"47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742\": container with ID starting with 47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742 not found: ID does not exist"
Apr 16 18:06:48.084393 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.084359 2578 scope.go:117] "RemoveContainer" containerID="f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8"
Apr 16 18:06:48.084559 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.084543 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8"} err="failed to get container status \"f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8\": rpc error: code = NotFound desc = could not find container \"f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8\": container with ID starting with f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8 not found: ID does not exist"
Apr 16 18:06:48.084622 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.084559 2578 scope.go:117] "RemoveContainer" containerID="ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f"
Apr 16 18:06:48.084774 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.084756 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f"} err="failed to get container status \"ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f\": rpc error: code = NotFound desc = could not find container \"ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f\": container with ID starting with ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f not found: ID does not exist"
Apr 16 18:06:48.084826 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.084774 2578 scope.go:117] "RemoveContainer" containerID="a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b"
Apr 16 18:06:48.084970 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.084952 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b"} err="failed to get container status \"a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b\": rpc error: code = NotFound desc = could not find container \"a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b\": container with ID starting with a85604b4ecb148b8c99b4e21ea2caa87c1042ddd6e0e92d3a08699ac5291ff7b not found: ID does not exist"
Apr 16 18:06:48.085009 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.084971 2578 scope.go:117] "RemoveContainer" containerID="3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628"
Apr 16 18:06:48.085164 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.085150 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628"} err="failed to get container status \"3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628\": rpc error: code = NotFound desc = could not find container \"3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628\": container with ID starting with 3b94e33a007b51c1928ca6eb8de34d1bb3ddc8e5dae59e64ee9d9d97c066c628 not found: ID does not exist"
Apr 16 18:06:48.085210 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.085165 2578 scope.go:117] "RemoveContainer" containerID="8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4"
Apr 16 18:06:48.085360 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.085335 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4"} err="failed to get container status \"8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4\": rpc error: code = NotFound desc = could not find container \"8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4\": container with ID starting with 8507c61f0e8aa793260e2bdd4df0d1bb410524e5b25ed2d811a12ec12feb48d4 not found: ID does not exist"
Apr 16 18:06:48.085360 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.085356 2578 scope.go:117] "RemoveContainer" containerID="4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add"
Apr 16 18:06:48.085516 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.085497 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add"} err="failed to get container status \"4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add\": rpc error: code = NotFound desc = could not find container \"4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add\": container with ID starting with 4d51b6239f53298e8e598551b1f287ffbf9c4a1158de39b73ea40d805b5b6add not found: ID does not exist"
Apr 16 18:06:48.085516 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.085514 2578 scope.go:117] "RemoveContainer" containerID="47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742"
Apr 16 18:06:48.085755 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.085737 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742"} err="failed to get container status \"47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742\": rpc error: code = NotFound desc = could not find container \"47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742\": container with ID starting with 47e30a1d0856bcf986107117e5559d0c15272f7b9ad575584184f19e09770742 not found: ID does not exist"
Apr 16 18:06:48.085813 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.085756 2578 scope.go:117] "RemoveContainer" containerID="f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8"
Apr 16 18:06:48.085966 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.085949 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8"} err="failed to get container status \"f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8\": rpc error: code = NotFound desc = could not find container \"f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8\": container with ID starting with f8904e120d1e570f1f03aad3aeea105545b083ca076adc40dc160ff0c0d483d8 not found: ID does not exist"
Apr 16 18:06:48.086018 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.085967 2578 scope.go:117] "RemoveContainer" containerID="ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f"
Apr 16 18:06:48.086163 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.086146 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f"} err="failed to get container status \"ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f\": rpc error: code = NotFound desc = could not find container \"ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f\": container with ID starting with ea7484afe08900ad58c07ff4659f92f499b888b5e4b8aec4847957d7fcfc368f not found: ID does not exist"
Apr 16 18:06:48.094419 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094390 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 18:06:48.094674 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094660 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="thanos-sidecar"
Apr 16 18:06:48.094674 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094675 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="thanos-sidecar"
Apr 16 18:06:48.094796 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094684 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="kube-rbac-proxy"
Apr 16 18:06:48.094796 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094689 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="kube-rbac-proxy"
Apr 16 18:06:48.094796 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094697 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="kube-rbac-proxy-web"
Apr 16 18:06:48.094796 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094702 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="kube-rbac-proxy-web"
Apr 16 18:06:48.094796 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094711 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad99f7e5-4161-446e-ad87-82c8c5677f5f" containerName="registry"
Apr 16 18:06:48.094796 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094717 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad99f7e5-4161-446e-ad87-82c8c5677f5f" containerName="registry"
Apr 16 18:06:48.094796 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094725 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="config-reloader"
Apr 16 18:06:48.094796 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094730 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="config-reloader"
Apr 16 18:06:48.094796 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094738 2578 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="prometheus" Apr 16 18:06:48.094796 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094743 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="prometheus" Apr 16 18:06:48.094796 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094749 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="kube-rbac-proxy-thanos" Apr 16 18:06:48.094796 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094754 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="kube-rbac-proxy-thanos" Apr 16 18:06:48.094796 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094762 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="init-config-reloader" Apr 16 18:06:48.094796 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094768 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="init-config-reloader" Apr 16 18:06:48.094796 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094803 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="thanos-sidecar" Apr 16 18:06:48.095297 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094810 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="kube-rbac-proxy" Apr 16 18:06:48.095297 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094817 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad99f7e5-4161-446e-ad87-82c8c5677f5f" containerName="registry" Apr 16 18:06:48.095297 ip-10-0-134-55 kubenswrapper[2578]: I0416 
18:06:48.094824 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="kube-rbac-proxy-web" Apr 16 18:06:48.095297 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094831 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="config-reloader" Apr 16 18:06:48.095297 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094837 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="kube-rbac-proxy-thanos" Apr 16 18:06:48.095297 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.094843 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="15117bda-c3e8-4779-b823-535e488d3664" containerName="prometheus" Apr 16 18:06:48.097947 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.097933 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.101004 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.100985 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-9lnm6\"" Apr 16 18:06:48.101088 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.101008 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 18:06:48.101088 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.101059 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 18:06:48.101524 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.101505 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 18:06:48.101703 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.101681 2578 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3v5jig657vmeb\"" Apr 16 18:06:48.101799 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.101777 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 18:06:48.101854 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.101797 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 18:06:48.101854 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.101812 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 18:06:48.102332 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.102316 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 18:06:48.102449 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.102346 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 18:06:48.102449 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.102353 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 18:06:48.103569 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.103524 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 18:06:48.105279 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.105260 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 18:06:48.108262 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.108246 2578 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 18:06:48.112205 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.112186 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:06:48.114972 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.114953 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.115040 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.114980 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.115040 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.115000 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.115040 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.115016 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: 
\"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.115040 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.115036 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.115206 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.115084 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7qz2\" (UniqueName: \"kubernetes.io/projected/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-kube-api-access-n7qz2\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.115206 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.115164 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.115206 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.115189 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.115310 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.115211 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.115310 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.115227 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.115310 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.115249 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.115310 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.115281 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.115310 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.115297 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-web-config\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.115310 
ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.115311 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.115553 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.115326 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-config-out\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.115553 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.115346 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-config\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.115553 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.115428 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.115553 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.115469 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.216795 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.216766 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.216795 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.216798 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.216944 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.216820 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-web-config\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.216944 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.216844 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.216944 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.216868 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-config-out\") pod \"prometheus-k8s-0\" (UID: 
\"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.216944 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.216895 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-config\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.216944 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.216937 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.217157 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.217083 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.217210 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.217152 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.217210 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.217180 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.217585 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.217561 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.217854 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.217694 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.217928 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.217881 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.217928 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.217916 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.218042 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.217940 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-n7qz2\" (UniqueName: \"kubernetes.io/projected/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-kube-api-access-n7qz2\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.218042 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.217985 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.218042 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.218019 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.218213 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.218054 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.218213 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.218056 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 16 18:06:48.218213 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.218114 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.218213 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.218170 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.219497 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.219118 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.220725 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.220234 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.220725 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.220265 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-config\") pod \"prometheus-k8s-0\" (UID: 
\"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.220725 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.220368 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-web-config\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.220725 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.220438 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.220725 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.220638 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.221468 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.221411 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.221917 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.221883 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: 
\"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.222300 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.222273 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.222746 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.222721 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.222827 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.222743 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.223085 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.223068 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.223152 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.223087 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-config-out\") pod \"prometheus-k8s-0\" (UID: 
\"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.224613 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.224580 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.227214 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.227198 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7qz2\" (UniqueName: \"kubernetes.io/projected/dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da-kube-api-access-n7qz2\") pod \"prometheus-k8s-0\" (UID: \"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.254708 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.254674 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15117bda-c3e8-4779-b823-535e488d3664" path="/var/lib/kubelet/pods/15117bda-c3e8-4779-b823-535e488d3664/volumes" Apr 16 18:06:48.408636 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.408525 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:48.532980 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:48.532956 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:06:48.535188 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:06:48.535156 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbdfbd1a_5caf_430c_9dc2_4e8bc6d974da.slice/crio-ee905ea44aeba1978f29aa05db4e8f31e76b6a3506eb65b8d56e9763f352d8bb WatchSource:0}: Error finding container ee905ea44aeba1978f29aa05db4e8f31e76b6a3506eb65b8d56e9763f352d8bb: Status 404 returned error can't find the container with id ee905ea44aeba1978f29aa05db4e8f31e76b6a3506eb65b8d56e9763f352d8bb Apr 16 18:06:49.033090 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:49.033046 2578 generic.go:358] "Generic (PLEG): container finished" podID="dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da" containerID="580aca9488abf937c4efea078549eb7e1a30d9890c58121133904f064f9b91d3" exitCode=0 Apr 16 18:06:49.033495 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:49.033145 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da","Type":"ContainerDied","Data":"580aca9488abf937c4efea078549eb7e1a30d9890c58121133904f064f9b91d3"} Apr 16 18:06:49.033495 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:49.033176 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da","Type":"ContainerStarted","Data":"ee905ea44aeba1978f29aa05db4e8f31e76b6a3506eb65b8d56e9763f352d8bb"} Apr 16 18:06:50.039859 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:50.039826 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da","Type":"ContainerStarted","Data":"2f7b8c317fdfd8075ffbe750337f75601a78c0e82ff2c714be1cda3bdd767273"} Apr 16 18:06:50.039859 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:50.039863 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da","Type":"ContainerStarted","Data":"87d1bf56cacb4df7c2ccfeadbf70e667cbf34c50a8718ffc44fe8f08dc1a3848"} Apr 16 18:06:50.040280 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:50.039872 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da","Type":"ContainerStarted","Data":"7315d42405c76c3c1b61f92d81496c8444e238ea9dd952c87772cc510543f8ec"} Apr 16 18:06:50.040280 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:50.039883 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da","Type":"ContainerStarted","Data":"88ef09b6f5adafb3e6e5f0a0b6e8179388278e8840e26e68c231522100746a3e"} Apr 16 18:06:50.040280 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:50.039896 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da","Type":"ContainerStarted","Data":"09ddcd328dce7455ef82b4c613271cb102a6c85ac81fc61bc685a3924eb12087"} Apr 16 18:06:50.040280 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:50.039904 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da","Type":"ContainerStarted","Data":"ccc91b3c373578c3518f25fa685c562d444ba2309a66057ed881cbaeb5117fa2"} Apr 16 18:06:50.071159 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:50.071119 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.071103591 podStartE2EDuration="2.071103591s" podCreationTimestamp="2026-04-16 18:06:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:06:50.068533974 +0000 UTC m=+280.372267384" watchObservedRunningTime="2026-04-16 18:06:50.071103591 +0000 UTC m=+280.374836985" Apr 16 18:06:50.689662 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:06:50.689587 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-kf467" podUID="b360b9d8-466f-4200-b65e-ee3078c85b2f" Apr 16 18:06:50.689662 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:06:50.689621 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-l5w8p" podUID="bf5f2e9a-c558-4494-9601-e1ad83f7056e" Apr 16 18:06:51.043088 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:51.043057 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-l5w8p" Apr 16 18:06:51.043514 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:51.043061 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kf467" Apr 16 18:06:53.409378 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:53.409344 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:54.060505 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:54.060475 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert\") pod \"ingress-canary-kf467\" (UID: \"b360b9d8-466f-4200-b65e-ee3078c85b2f\") " pod="openshift-ingress-canary/ingress-canary-kf467" Apr 16 18:06:54.060683 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:54.060526 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls\") pod \"dns-default-l5w8p\" (UID: \"bf5f2e9a-c558-4494-9601-e1ad83f7056e\") " pod="openshift-dns/dns-default-l5w8p" Apr 16 18:06:54.062973 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:54.062944 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf5f2e9a-c558-4494-9601-e1ad83f7056e-metrics-tls\") pod \"dns-default-l5w8p\" (UID: \"bf5f2e9a-c558-4494-9601-e1ad83f7056e\") " pod="openshift-dns/dns-default-l5w8p" Apr 16 18:06:54.063099 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:54.063077 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b360b9d8-466f-4200-b65e-ee3078c85b2f-cert\") pod \"ingress-canary-kf467\" (UID: \"b360b9d8-466f-4200-b65e-ee3078c85b2f\") " pod="openshift-ingress-canary/ingress-canary-kf467" Apr 16 18:06:54.348159 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:54.348071 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-j5p9z\"" Apr 16 
18:06:54.348159 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:54.348071 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dxq2x\"" Apr 16 18:06:54.354502 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:54.354484 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kf467" Apr 16 18:06:54.354593 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:54.354579 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-l5w8p" Apr 16 18:06:54.480133 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:54.480083 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-l5w8p"] Apr 16 18:06:54.483889 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:06:54.483854 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf5f2e9a_c558_4494_9601_e1ad83f7056e.slice/crio-b1cd2911d08792f56e77eddeb127c4ee5686f6c5f501530c6d6bec9ec578436e WatchSource:0}: Error finding container b1cd2911d08792f56e77eddeb127c4ee5686f6c5f501530c6d6bec9ec578436e: Status 404 returned error can't find the container with id b1cd2911d08792f56e77eddeb127c4ee5686f6c5f501530c6d6bec9ec578436e Apr 16 18:06:54.496200 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:54.496172 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kf467"] Apr 16 18:06:54.498734 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:06:54.498709 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb360b9d8_466f_4200_b65e_ee3078c85b2f.slice/crio-11f29962f91fd5446a67c31bb4b7c6517e31894eb99da9b6d3fa0c17992c4590 WatchSource:0}: Error finding container 11f29962f91fd5446a67c31bb4b7c6517e31894eb99da9b6d3fa0c17992c4590: Status 404 returned error can't 
find the container with id 11f29962f91fd5446a67c31bb4b7c6517e31894eb99da9b6d3fa0c17992c4590 Apr 16 18:06:55.056147 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:55.056101 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kf467" event={"ID":"b360b9d8-466f-4200-b65e-ee3078c85b2f","Type":"ContainerStarted","Data":"11f29962f91fd5446a67c31bb4b7c6517e31894eb99da9b6d3fa0c17992c4590"} Apr 16 18:06:55.057268 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:55.057238 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l5w8p" event={"ID":"bf5f2e9a-c558-4494-9601-e1ad83f7056e","Type":"ContainerStarted","Data":"b1cd2911d08792f56e77eddeb127c4ee5686f6c5f501530c6d6bec9ec578436e"} Apr 16 18:06:57.064208 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:57.064159 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kf467" event={"ID":"b360b9d8-466f-4200-b65e-ee3078c85b2f","Type":"ContainerStarted","Data":"d6ff6fddc95d76242b295e4ffdfe9545c1d3d7b9b8fffeb1117abb8398a1887e"} Apr 16 18:06:57.065742 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:57.065714 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l5w8p" event={"ID":"bf5f2e9a-c558-4494-9601-e1ad83f7056e","Type":"ContainerStarted","Data":"38ccb73f5be9176e6c499ca1559e0e788c63da69e6835257c2ed69dec5bac89a"} Apr 16 18:06:57.065742 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:57.065744 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l5w8p" event={"ID":"bf5f2e9a-c558-4494-9601-e1ad83f7056e","Type":"ContainerStarted","Data":"ce3bbeac7368f6709490d1f7ebdf0a8e91e2ed93969e3b1c36a1839399f601eb"} Apr 16 18:06:57.065899 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:57.065839 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-l5w8p" Apr 16 18:06:57.083398 ip-10-0-134-55 
kubenswrapper[2578]: I0416 18:06:57.083356 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kf467" podStartSLOduration=251.136422686 podStartE2EDuration="4m13.083343867s" podCreationTimestamp="2026-04-16 18:02:44 +0000 UTC" firstStartedPulling="2026-04-16 18:06:54.500184757 +0000 UTC m=+284.803918129" lastFinishedPulling="2026-04-16 18:06:56.447105922 +0000 UTC m=+286.750839310" observedRunningTime="2026-04-16 18:06:57.081852902 +0000 UTC m=+287.385586296" watchObservedRunningTime="2026-04-16 18:06:57.083343867 +0000 UTC m=+287.387077262" Apr 16 18:06:57.106798 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:06:57.106742 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-l5w8p" podStartSLOduration=251.143923474 podStartE2EDuration="4m13.106725938s" podCreationTimestamp="2026-04-16 18:02:44 +0000 UTC" firstStartedPulling="2026-04-16 18:06:54.485752877 +0000 UTC m=+284.789486250" lastFinishedPulling="2026-04-16 18:06:56.448555342 +0000 UTC m=+286.752288714" observedRunningTime="2026-04-16 18:06:57.106289636 +0000 UTC m=+287.410023033" watchObservedRunningTime="2026-04-16 18:06:57.106725938 +0000 UTC m=+287.410459333" Apr 16 18:07:07.071350 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:07:07.071322 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-l5w8p" Apr 16 18:07:10.137920 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:07:10.137898 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:07:48.409437 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:07:48.409388 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:07:48.425201 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:07:48.425174 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
18:07:49.218498 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:07:49.218474 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:15:09.609040 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:09.609003 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-84rgk"] Apr 16 18:15:09.611994 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:09.611978 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-84rgk" Apr 16 18:15:09.614883 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:09.614854 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 18:15:09.615010 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:09.614921 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:15:09.615010 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:09.614932 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:15:09.615010 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:09.614933 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-mpsp8\"" Apr 16 18:15:09.619650 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:09.619590 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-84rgk"] Apr 16 18:15:09.720790 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:09.720755 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtqk7\" (UniqueName: \"kubernetes.io/projected/bab1736b-0759-4f5a-8406-d90803ac2552-kube-api-access-wtqk7\") pod \"s3-init-84rgk\" (UID: \"bab1736b-0759-4f5a-8406-d90803ac2552\") " pod="kserve/s3-init-84rgk" Apr 16 18:15:09.821667 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:09.821622 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtqk7\" (UniqueName: \"kubernetes.io/projected/bab1736b-0759-4f5a-8406-d90803ac2552-kube-api-access-wtqk7\") pod \"s3-init-84rgk\" (UID: \"bab1736b-0759-4f5a-8406-d90803ac2552\") " pod="kserve/s3-init-84rgk" Apr 16 18:15:09.830869 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:09.830842 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtqk7\" (UniqueName: \"kubernetes.io/projected/bab1736b-0759-4f5a-8406-d90803ac2552-kube-api-access-wtqk7\") pod \"s3-init-84rgk\" (UID: \"bab1736b-0759-4f5a-8406-d90803ac2552\") " pod="kserve/s3-init-84rgk" Apr 16 18:15:09.921535 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:09.921449 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-84rgk" Apr 16 18:15:10.045352 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:10.045322 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-84rgk"] Apr 16 18:15:10.047982 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:15:10.047953 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbab1736b_0759_4f5a_8406_d90803ac2552.slice/crio-9eead049207b7fa3687acbb73a48c93dcb2414a77ac5a18044b779a12bb190a1 WatchSource:0}: Error finding container 9eead049207b7fa3687acbb73a48c93dcb2414a77ac5a18044b779a12bb190a1: Status 404 returned error can't find the container with id 9eead049207b7fa3687acbb73a48c93dcb2414a77ac5a18044b779a12bb190a1 Apr 16 18:15:10.049815 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:10.049796 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:15:10.368061 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:10.368027 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-84rgk" 
event={"ID":"bab1736b-0759-4f5a-8406-d90803ac2552","Type":"ContainerStarted","Data":"9eead049207b7fa3687acbb73a48c93dcb2414a77ac5a18044b779a12bb190a1"} Apr 16 18:15:14.515456 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:14.515434 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 18:15:15.383759 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:15.383717 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-84rgk" event={"ID":"bab1736b-0759-4f5a-8406-d90803ac2552","Type":"ContainerStarted","Data":"16b518634f39c87df0076231bc633ac0808b39af26a6dbddc4b63835c39e0741"} Apr 16 18:15:15.414703 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:15.414636 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-84rgk" podStartSLOduration=1.952352137 podStartE2EDuration="6.41461561s" podCreationTimestamp="2026-04-16 18:15:09 +0000 UTC" firstStartedPulling="2026-04-16 18:15:10.049970303 +0000 UTC m=+780.353703681" lastFinishedPulling="2026-04-16 18:15:14.512233775 +0000 UTC m=+784.815967154" observedRunningTime="2026-04-16 18:15:15.413117268 +0000 UTC m=+785.716850666" watchObservedRunningTime="2026-04-16 18:15:15.41461561 +0000 UTC m=+785.718349004" Apr 16 18:15:18.391377 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:18.391337 2578 generic.go:358] "Generic (PLEG): container finished" podID="bab1736b-0759-4f5a-8406-d90803ac2552" containerID="16b518634f39c87df0076231bc633ac0808b39af26a6dbddc4b63835c39e0741" exitCode=0 Apr 16 18:15:18.391786 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:18.391410 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-84rgk" event={"ID":"bab1736b-0759-4f5a-8406-d90803ac2552","Type":"ContainerDied","Data":"16b518634f39c87df0076231bc633ac0808b39af26a6dbddc4b63835c39e0741"} Apr 16 18:15:19.516273 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:19.516253 2578 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="kserve/s3-init-84rgk" Apr 16 18:15:19.597947 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:19.597923 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtqk7\" (UniqueName: \"kubernetes.io/projected/bab1736b-0759-4f5a-8406-d90803ac2552-kube-api-access-wtqk7\") pod \"bab1736b-0759-4f5a-8406-d90803ac2552\" (UID: \"bab1736b-0759-4f5a-8406-d90803ac2552\") " Apr 16 18:15:19.600156 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:19.600127 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bab1736b-0759-4f5a-8406-d90803ac2552-kube-api-access-wtqk7" (OuterVolumeSpecName: "kube-api-access-wtqk7") pod "bab1736b-0759-4f5a-8406-d90803ac2552" (UID: "bab1736b-0759-4f5a-8406-d90803ac2552"). InnerVolumeSpecName "kube-api-access-wtqk7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:15:19.698568 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:19.698498 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wtqk7\" (UniqueName: \"kubernetes.io/projected/bab1736b-0759-4f5a-8406-d90803ac2552-kube-api-access-wtqk7\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:15:20.397385 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:20.397354 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-84rgk" Apr 16 18:15:20.397540 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:20.397353 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-84rgk" event={"ID":"bab1736b-0759-4f5a-8406-d90803ac2552","Type":"ContainerDied","Data":"9eead049207b7fa3687acbb73a48c93dcb2414a77ac5a18044b779a12bb190a1"} Apr 16 18:15:20.397540 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:20.397471 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9eead049207b7fa3687acbb73a48c93dcb2414a77ac5a18044b779a12bb190a1" Apr 16 18:15:30.576556 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:30.576477 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95"] Apr 16 18:15:30.576907 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:30.576826 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bab1736b-0759-4f5a-8406-d90803ac2552" containerName="s3-init" Apr 16 18:15:30.576907 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:30.576838 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab1736b-0759-4f5a-8406-d90803ac2552" containerName="s3-init" Apr 16 18:15:30.576907 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:30.576894 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="bab1736b-0759-4f5a-8406-d90803ac2552" containerName="s3-init" Apr 16 18:15:30.578691 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:30.578676 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" Apr 16 18:15:30.582925 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:30.582905 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-zqf8f\"" Apr 16 18:15:30.609980 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:30.609956 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95"] Apr 16 18:15:30.674831 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:30.674793 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c07179db-ff3e-446a-be1f-c11a701a6f98-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95\" (UID: \"c07179db-ff3e-446a-be1f-c11a701a6f98\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" Apr 16 18:15:30.775139 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:30.775097 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c07179db-ff3e-446a-be1f-c11a701a6f98-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95\" (UID: \"c07179db-ff3e-446a-be1f-c11a701a6f98\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" Apr 16 18:15:30.775437 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:30.775420 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c07179db-ff3e-446a-be1f-c11a701a6f98-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95\" (UID: \"c07179db-ff3e-446a-be1f-c11a701a6f98\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" Apr 16 18:15:30.887710 ip-10-0-134-55 
kubenswrapper[2578]: I0416 18:15:30.887628 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" Apr 16 18:15:31.062683 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:31.062661 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95"] Apr 16 18:15:31.065181 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:15:31.065151 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc07179db_ff3e_446a_be1f_c11a701a6f98.slice/crio-99ab2589864d392d181f10f181f03cc5a0fe121051ea871e8c39132138555707 WatchSource:0}: Error finding container 99ab2589864d392d181f10f181f03cc5a0fe121051ea871e8c39132138555707: Status 404 returned error can't find the container with id 99ab2589864d392d181f10f181f03cc5a0fe121051ea871e8c39132138555707 Apr 16 18:15:31.427217 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:31.427178 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" event={"ID":"c07179db-ff3e-446a-be1f-c11a701a6f98","Type":"ContainerStarted","Data":"99ab2589864d392d181f10f181f03cc5a0fe121051ea871e8c39132138555707"} Apr 16 18:15:35.439714 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:35.439680 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" event={"ID":"c07179db-ff3e-446a-be1f-c11a701a6f98","Type":"ContainerStarted","Data":"7383dab6a6e6e107689507942c73908deeaeed7162b381fe0587d5e120a0c685"} Apr 16 18:15:39.450290 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:39.450258 2578 generic.go:358] "Generic (PLEG): container finished" podID="c07179db-ff3e-446a-be1f-c11a701a6f98" containerID="7383dab6a6e6e107689507942c73908deeaeed7162b381fe0587d5e120a0c685" exitCode=0 Apr 16 18:15:39.450643 ip-10-0-134-55 
kubenswrapper[2578]: I0416 18:15:39.450333 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" event={"ID":"c07179db-ff3e-446a-be1f-c11a701a6f98","Type":"ContainerDied","Data":"7383dab6a6e6e107689507942c73908deeaeed7162b381fe0587d5e120a0c685"} Apr 16 18:15:53.498849 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:53.498813 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" event={"ID":"c07179db-ff3e-446a-be1f-c11a701a6f98","Type":"ContainerStarted","Data":"a09299399afc242717fc367db4fb799c563399c18e2ee6a789516e446efb9bfe"} Apr 16 18:15:53.499302 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:53.499141 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" Apr 16 18:15:53.500339 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:53.500313 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" podUID="c07179db-ff3e-446a-be1f-c11a701a6f98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 16 18:15:53.518122 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:15:53.518077 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" podStartSLOduration=2.003766953 podStartE2EDuration="23.518063754s" podCreationTimestamp="2026-04-16 18:15:30 +0000 UTC" firstStartedPulling="2026-04-16 18:15:31.067099339 +0000 UTC m=+801.370832715" lastFinishedPulling="2026-04-16 18:15:52.58139614 +0000 UTC m=+822.885129516" observedRunningTime="2026-04-16 18:15:53.516890857 +0000 UTC m=+823.820624252" watchObservedRunningTime="2026-04-16 18:15:53.518063754 +0000 UTC m=+823.821797150" Apr 16 18:15:54.501588 ip-10-0-134-55 
kubenswrapper[2578]: I0416 18:15:54.501547 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" podUID="c07179db-ff3e-446a-be1f-c11a701a6f98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 16 18:16:04.502136 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:16:04.502091 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" podUID="c07179db-ff3e-446a-be1f-c11a701a6f98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 16 18:16:14.501891 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:16:14.501851 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" podUID="c07179db-ff3e-446a-be1f-c11a701a6f98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 16 18:16:24.502155 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:16:24.502110 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" podUID="c07179db-ff3e-446a-be1f-c11a701a6f98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 16 18:16:34.502574 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:16:34.502529 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" podUID="c07179db-ff3e-446a-be1f-c11a701a6f98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 16 18:16:44.502454 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:16:44.502408 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" podUID="c07179db-ff3e-446a-be1f-c11a701a6f98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 16 18:16:54.502323 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:16:54.502230 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" podUID="c07179db-ff3e-446a-be1f-c11a701a6f98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 16 18:16:59.251727 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:16:59.251700 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" Apr 16 18:17:30.599337 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:17:30.599305 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95"] Apr 16 18:17:30.599814 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:17:30.599593 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" podUID="c07179db-ff3e-446a-be1f-c11a701a6f98" containerName="kserve-container" containerID="cri-o://a09299399afc242717fc367db4fb799c563399c18e2ee6a789516e446efb9bfe" gracePeriod=30 Apr 16 18:17:34.536057 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:17:34.536035 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" Apr 16 18:17:34.638712 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:17:34.638588 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c07179db-ff3e-446a-be1f-c11a701a6f98-kserve-provision-location\") pod \"c07179db-ff3e-446a-be1f-c11a701a6f98\" (UID: \"c07179db-ff3e-446a-be1f-c11a701a6f98\") " Apr 16 18:17:34.638957 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:17:34.638931 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c07179db-ff3e-446a-be1f-c11a701a6f98-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c07179db-ff3e-446a-be1f-c11a701a6f98" (UID: "c07179db-ff3e-446a-be1f-c11a701a6f98"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:17:34.740017 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:17:34.739961 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c07179db-ff3e-446a-be1f-c11a701a6f98-kserve-provision-location\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:17:34.773755 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:17:34.773719 2578 generic.go:358] "Generic (PLEG): container finished" podID="c07179db-ff3e-446a-be1f-c11a701a6f98" containerID="a09299399afc242717fc367db4fb799c563399c18e2ee6a789516e446efb9bfe" exitCode=0 Apr 16 18:17:34.773926 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:17:34.773768 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" event={"ID":"c07179db-ff3e-446a-be1f-c11a701a6f98","Type":"ContainerDied","Data":"a09299399afc242717fc367db4fb799c563399c18e2ee6a789516e446efb9bfe"} Apr 16 18:17:34.773926 ip-10-0-134-55 
kubenswrapper[2578]: I0416 18:17:34.773786 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" Apr 16 18:17:34.773926 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:17:34.773795 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95" event={"ID":"c07179db-ff3e-446a-be1f-c11a701a6f98","Type":"ContainerDied","Data":"99ab2589864d392d181f10f181f03cc5a0fe121051ea871e8c39132138555707"} Apr 16 18:17:34.773926 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:17:34.773812 2578 scope.go:117] "RemoveContainer" containerID="a09299399afc242717fc367db4fb799c563399c18e2ee6a789516e446efb9bfe" Apr 16 18:17:34.781624 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:17:34.781583 2578 scope.go:117] "RemoveContainer" containerID="7383dab6a6e6e107689507942c73908deeaeed7162b381fe0587d5e120a0c685" Apr 16 18:17:34.789030 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:17:34.789011 2578 scope.go:117] "RemoveContainer" containerID="a09299399afc242717fc367db4fb799c563399c18e2ee6a789516e446efb9bfe" Apr 16 18:17:34.789323 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:17:34.789302 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a09299399afc242717fc367db4fb799c563399c18e2ee6a789516e446efb9bfe\": container with ID starting with a09299399afc242717fc367db4fb799c563399c18e2ee6a789516e446efb9bfe not found: ID does not exist" containerID="a09299399afc242717fc367db4fb799c563399c18e2ee6a789516e446efb9bfe" Apr 16 18:17:34.789360 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:17:34.789335 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a09299399afc242717fc367db4fb799c563399c18e2ee6a789516e446efb9bfe"} err="failed to get container status \"a09299399afc242717fc367db4fb799c563399c18e2ee6a789516e446efb9bfe\": rpc error: 
code = NotFound desc = could not find container \"a09299399afc242717fc367db4fb799c563399c18e2ee6a789516e446efb9bfe\": container with ID starting with a09299399afc242717fc367db4fb799c563399c18e2ee6a789516e446efb9bfe not found: ID does not exist" Apr 16 18:17:34.789360 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:17:34.789355 2578 scope.go:117] "RemoveContainer" containerID="7383dab6a6e6e107689507942c73908deeaeed7162b381fe0587d5e120a0c685" Apr 16 18:17:34.789592 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:17:34.789572 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7383dab6a6e6e107689507942c73908deeaeed7162b381fe0587d5e120a0c685\": container with ID starting with 7383dab6a6e6e107689507942c73908deeaeed7162b381fe0587d5e120a0c685 not found: ID does not exist" containerID="7383dab6a6e6e107689507942c73908deeaeed7162b381fe0587d5e120a0c685" Apr 16 18:17:34.789650 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:17:34.789618 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7383dab6a6e6e107689507942c73908deeaeed7162b381fe0587d5e120a0c685"} err="failed to get container status \"7383dab6a6e6e107689507942c73908deeaeed7162b381fe0587d5e120a0c685\": rpc error: code = NotFound desc = could not find container \"7383dab6a6e6e107689507942c73908deeaeed7162b381fe0587d5e120a0c685\": container with ID starting with 7383dab6a6e6e107689507942c73908deeaeed7162b381fe0587d5e120a0c685 not found: ID does not exist" Apr 16 18:17:34.796247 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:17:34.796222 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95"] Apr 16 18:17:34.800010 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:17:34.799990 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-xqs95"] Apr 16 18:17:36.253880 ip-10-0-134-55 
kubenswrapper[2578]: I0416 18:17:36.253848 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c07179db-ff3e-446a-be1f-c11a701a6f98" path="/var/lib/kubelet/pods/c07179db-ff3e-446a-be1f-c11a701a6f98/volumes" Apr 16 18:55:11.613559 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:11.613522 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d6k77/must-gather-bkdzf"] Apr 16 18:55:11.615995 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:11.613784 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c07179db-ff3e-446a-be1f-c11a701a6f98" containerName="kserve-container" Apr 16 18:55:11.615995 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:11.613796 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c07179db-ff3e-446a-be1f-c11a701a6f98" containerName="kserve-container" Apr 16 18:55:11.615995 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:11.613811 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c07179db-ff3e-446a-be1f-c11a701a6f98" containerName="storage-initializer" Apr 16 18:55:11.615995 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:11.613817 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c07179db-ff3e-446a-be1f-c11a701a6f98" containerName="storage-initializer" Apr 16 18:55:11.615995 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:11.613854 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c07179db-ff3e-446a-be1f-c11a701a6f98" containerName="kserve-container" Apr 16 18:55:11.616917 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:11.616901 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6k77/must-gather-bkdzf" Apr 16 18:55:11.620257 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:11.620233 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-d6k77\"/\"default-dockercfg-d8w2x\"" Apr 16 18:55:11.620579 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:11.620563 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d6k77\"/\"openshift-service-ca.crt\"" Apr 16 18:55:11.620918 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:11.620904 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d6k77\"/\"kube-root-ca.crt\"" Apr 16 18:55:11.635830 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:11.635799 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d6k77/must-gather-bkdzf"] Apr 16 18:55:11.683575 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:11.683549 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw9gb\" (UniqueName: \"kubernetes.io/projected/e850988e-8daf-4edc-9180-11c0943f15ea-kube-api-access-mw9gb\") pod \"must-gather-bkdzf\" (UID: \"e850988e-8daf-4edc-9180-11c0943f15ea\") " pod="openshift-must-gather-d6k77/must-gather-bkdzf" Apr 16 18:55:11.683696 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:11.683580 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e850988e-8daf-4edc-9180-11c0943f15ea-must-gather-output\") pod \"must-gather-bkdzf\" (UID: \"e850988e-8daf-4edc-9180-11c0943f15ea\") " pod="openshift-must-gather-d6k77/must-gather-bkdzf" Apr 16 18:55:11.784450 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:11.784430 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mw9gb\" (UniqueName: 
\"kubernetes.io/projected/e850988e-8daf-4edc-9180-11c0943f15ea-kube-api-access-mw9gb\") pod \"must-gather-bkdzf\" (UID: \"e850988e-8daf-4edc-9180-11c0943f15ea\") " pod="openshift-must-gather-d6k77/must-gather-bkdzf" Apr 16 18:55:11.784559 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:11.784466 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e850988e-8daf-4edc-9180-11c0943f15ea-must-gather-output\") pod \"must-gather-bkdzf\" (UID: \"e850988e-8daf-4edc-9180-11c0943f15ea\") " pod="openshift-must-gather-d6k77/must-gather-bkdzf" Apr 16 18:55:11.784806 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:11.784777 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e850988e-8daf-4edc-9180-11c0943f15ea-must-gather-output\") pod \"must-gather-bkdzf\" (UID: \"e850988e-8daf-4edc-9180-11c0943f15ea\") " pod="openshift-must-gather-d6k77/must-gather-bkdzf" Apr 16 18:55:11.794027 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:11.794004 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw9gb\" (UniqueName: \"kubernetes.io/projected/e850988e-8daf-4edc-9180-11c0943f15ea-kube-api-access-mw9gb\") pod \"must-gather-bkdzf\" (UID: \"e850988e-8daf-4edc-9180-11c0943f15ea\") " pod="openshift-must-gather-d6k77/must-gather-bkdzf" Apr 16 18:55:11.925156 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:11.925085 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6k77/must-gather-bkdzf" Apr 16 18:55:12.046957 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:12.046927 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d6k77/must-gather-bkdzf"] Apr 16 18:55:12.050211 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:55:12.050180 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode850988e_8daf_4edc_9180_11c0943f15ea.slice/crio-ba8ac8d72caaf2f705761a55e6c6509d7d0dde80bdb43ba39c5c4f6c9911f080 WatchSource:0}: Error finding container ba8ac8d72caaf2f705761a55e6c6509d7d0dde80bdb43ba39c5c4f6c9911f080: Status 404 returned error can't find the container with id ba8ac8d72caaf2f705761a55e6c6509d7d0dde80bdb43ba39c5c4f6c9911f080 Apr 16 18:55:12.051930 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:12.051915 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:55:12.681185 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:12.681142 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6k77/must-gather-bkdzf" event={"ID":"e850988e-8daf-4edc-9180-11c0943f15ea","Type":"ContainerStarted","Data":"ba8ac8d72caaf2f705761a55e6c6509d7d0dde80bdb43ba39c5c4f6c9911f080"} Apr 16 18:55:16.695674 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:16.695635 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6k77/must-gather-bkdzf" event={"ID":"e850988e-8daf-4edc-9180-11c0943f15ea","Type":"ContainerStarted","Data":"8a1fe814a65937e23669c7d5478d49cd7549469e5f001208550ae1028de448bd"} Apr 16 18:55:16.695674 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:16.695676 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6k77/must-gather-bkdzf" 
event={"ID":"e850988e-8daf-4edc-9180-11c0943f15ea","Type":"ContainerStarted","Data":"c02cc29254c92d83df092a83c69aece89bfa47cff2c004ba3f52647254cfc9be"} Apr 16 18:55:16.713125 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:16.713075 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d6k77/must-gather-bkdzf" podStartSLOduration=1.648043972 podStartE2EDuration="5.713060605s" podCreationTimestamp="2026-04-16 18:55:11 +0000 UTC" firstStartedPulling="2026-04-16 18:55:12.052032565 +0000 UTC m=+3182.355765939" lastFinishedPulling="2026-04-16 18:55:16.117049199 +0000 UTC m=+3186.420782572" observedRunningTime="2026-04-16 18:55:16.712340513 +0000 UTC m=+3187.016073907" watchObservedRunningTime="2026-04-16 18:55:16.713060605 +0000 UTC m=+3187.016793999" Apr 16 18:55:34.745822 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:34.745780 2578 generic.go:358] "Generic (PLEG): container finished" podID="e850988e-8daf-4edc-9180-11c0943f15ea" containerID="c02cc29254c92d83df092a83c69aece89bfa47cff2c004ba3f52647254cfc9be" exitCode=0 Apr 16 18:55:34.746248 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:34.745856 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6k77/must-gather-bkdzf" event={"ID":"e850988e-8daf-4edc-9180-11c0943f15ea","Type":"ContainerDied","Data":"c02cc29254c92d83df092a83c69aece89bfa47cff2c004ba3f52647254cfc9be"} Apr 16 18:55:34.746248 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:34.746197 2578 scope.go:117] "RemoveContainer" containerID="c02cc29254c92d83df092a83c69aece89bfa47cff2c004ba3f52647254cfc9be" Apr 16 18:55:35.504207 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:35.504176 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d6k77_must-gather-bkdzf_e850988e-8daf-4edc-9180-11c0943f15ea/gather/0.log" Apr 16 18:55:38.907723 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:38.907694 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-pwdrb_9d78cda3-41b6-4a64-9426-20bac8b58ad1/global-pull-secret-syncer/0.log" Apr 16 18:55:38.991469 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:38.991437 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-9gw8r_3b31ea4d-734b-4f90-a86a-3706e8c5d58f/konnectivity-agent/0.log" Apr 16 18:55:39.097046 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:39.097016 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-55.ec2.internal_eceb33926282e4ef98e3ef9d13dc120a/haproxy/0.log" Apr 16 18:55:40.933936 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:40.933901 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d6k77/must-gather-bkdzf"] Apr 16 18:55:40.934446 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:40.934128 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-d6k77/must-gather-bkdzf" podUID="e850988e-8daf-4edc-9180-11c0943f15ea" containerName="copy" containerID="cri-o://8a1fe814a65937e23669c7d5478d49cd7549469e5f001208550ae1028de448bd" gracePeriod=2 Apr 16 18:55:40.940446 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:40.940419 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d6k77/must-gather-bkdzf"] Apr 16 18:55:41.158084 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:41.158063 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d6k77_must-gather-bkdzf_e850988e-8daf-4edc-9180-11c0943f15ea/copy/0.log" Apr 16 18:55:41.158413 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:41.158398 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6k77/must-gather-bkdzf" Apr 16 18:55:41.161280 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:41.161257 2578 status_manager.go:895] "Failed to get status for pod" podUID="e850988e-8daf-4edc-9180-11c0943f15ea" pod="openshift-must-gather-d6k77/must-gather-bkdzf" err="pods \"must-gather-bkdzf\" is forbidden: User \"system:node:ip-10-0-134-55.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-d6k77\": no relationship found between node 'ip-10-0-134-55.ec2.internal' and this object" Apr 16 18:55:41.334142 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:41.334100 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw9gb\" (UniqueName: \"kubernetes.io/projected/e850988e-8daf-4edc-9180-11c0943f15ea-kube-api-access-mw9gb\") pod \"e850988e-8daf-4edc-9180-11c0943f15ea\" (UID: \"e850988e-8daf-4edc-9180-11c0943f15ea\") " Apr 16 18:55:41.334313 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:41.334167 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e850988e-8daf-4edc-9180-11c0943f15ea-must-gather-output\") pod \"e850988e-8daf-4edc-9180-11c0943f15ea\" (UID: \"e850988e-8daf-4edc-9180-11c0943f15ea\") " Apr 16 18:55:41.335648 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:41.335596 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e850988e-8daf-4edc-9180-11c0943f15ea-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e850988e-8daf-4edc-9180-11c0943f15ea" (UID: "e850988e-8daf-4edc-9180-11c0943f15ea"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:55:41.336466 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:41.336444 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e850988e-8daf-4edc-9180-11c0943f15ea-kube-api-access-mw9gb" (OuterVolumeSpecName: "kube-api-access-mw9gb") pod "e850988e-8daf-4edc-9180-11c0943f15ea" (UID: "e850988e-8daf-4edc-9180-11c0943f15ea"). InnerVolumeSpecName "kube-api-access-mw9gb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:55:41.434800 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:41.434757 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mw9gb\" (UniqueName: \"kubernetes.io/projected/e850988e-8daf-4edc-9180-11c0943f15ea-kube-api-access-mw9gb\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:55:41.434800 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:41.434790 2578 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e850988e-8daf-4edc-9180-11c0943f15ea-must-gather-output\") on node \"ip-10-0-134-55.ec2.internal\" DevicePath \"\"" Apr 16 18:55:41.763613 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:41.763572 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d6k77_must-gather-bkdzf_e850988e-8daf-4edc-9180-11c0943f15ea/copy/0.log" Apr 16 18:55:41.763918 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:41.763896 2578 generic.go:358] "Generic (PLEG): container finished" podID="e850988e-8daf-4edc-9180-11c0943f15ea" containerID="8a1fe814a65937e23669c7d5478d49cd7549469e5f001208550ae1028de448bd" exitCode=143 Apr 16 18:55:41.763981 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:41.763955 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6k77/must-gather-bkdzf" Apr 16 18:55:41.764021 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:41.763994 2578 scope.go:117] "RemoveContainer" containerID="8a1fe814a65937e23669c7d5478d49cd7549469e5f001208550ae1028de448bd" Apr 16 18:55:41.766934 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:41.766905 2578 status_manager.go:895] "Failed to get status for pod" podUID="e850988e-8daf-4edc-9180-11c0943f15ea" pod="openshift-must-gather-d6k77/must-gather-bkdzf" err="pods \"must-gather-bkdzf\" is forbidden: User \"system:node:ip-10-0-134-55.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-d6k77\": no relationship found between node 'ip-10-0-134-55.ec2.internal' and this object" Apr 16 18:55:41.771125 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:41.771102 2578 scope.go:117] "RemoveContainer" containerID="c02cc29254c92d83df092a83c69aece89bfa47cff2c004ba3f52647254cfc9be" Apr 16 18:55:41.774148 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:41.774124 2578 status_manager.go:895] "Failed to get status for pod" podUID="e850988e-8daf-4edc-9180-11c0943f15ea" pod="openshift-must-gather-d6k77/must-gather-bkdzf" err="pods \"must-gather-bkdzf\" is forbidden: User \"system:node:ip-10-0-134-55.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-d6k77\": no relationship found between node 'ip-10-0-134-55.ec2.internal' and this object" Apr 16 18:55:41.782663 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:41.782648 2578 scope.go:117] "RemoveContainer" containerID="8a1fe814a65937e23669c7d5478d49cd7549469e5f001208550ae1028de448bd" Apr 16 18:55:41.782916 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:55:41.782897 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a1fe814a65937e23669c7d5478d49cd7549469e5f001208550ae1028de448bd\": container with ID starting 
with 8a1fe814a65937e23669c7d5478d49cd7549469e5f001208550ae1028de448bd not found: ID does not exist" containerID="8a1fe814a65937e23669c7d5478d49cd7549469e5f001208550ae1028de448bd" Apr 16 18:55:41.782989 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:41.782923 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a1fe814a65937e23669c7d5478d49cd7549469e5f001208550ae1028de448bd"} err="failed to get container status \"8a1fe814a65937e23669c7d5478d49cd7549469e5f001208550ae1028de448bd\": rpc error: code = NotFound desc = could not find container \"8a1fe814a65937e23669c7d5478d49cd7549469e5f001208550ae1028de448bd\": container with ID starting with 8a1fe814a65937e23669c7d5478d49cd7549469e5f001208550ae1028de448bd not found: ID does not exist" Apr 16 18:55:41.782989 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:41.782942 2578 scope.go:117] "RemoveContainer" containerID="c02cc29254c92d83df092a83c69aece89bfa47cff2c004ba3f52647254cfc9be" Apr 16 18:55:41.783179 ip-10-0-134-55 kubenswrapper[2578]: E0416 18:55:41.783163 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c02cc29254c92d83df092a83c69aece89bfa47cff2c004ba3f52647254cfc9be\": container with ID starting with c02cc29254c92d83df092a83c69aece89bfa47cff2c004ba3f52647254cfc9be not found: ID does not exist" containerID="c02cc29254c92d83df092a83c69aece89bfa47cff2c004ba3f52647254cfc9be" Apr 16 18:55:41.783221 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:41.783184 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c02cc29254c92d83df092a83c69aece89bfa47cff2c004ba3f52647254cfc9be"} err="failed to get container status \"c02cc29254c92d83df092a83c69aece89bfa47cff2c004ba3f52647254cfc9be\": rpc error: code = NotFound desc = could not find container \"c02cc29254c92d83df092a83c69aece89bfa47cff2c004ba3f52647254cfc9be\": container with ID starting with 
c02cc29254c92d83df092a83c69aece89bfa47cff2c004ba3f52647254cfc9be not found: ID does not exist" Apr 16 18:55:42.254285 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:42.254252 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e850988e-8daf-4edc-9180-11c0943f15ea" path="/var/lib/kubelet/pods/e850988e-8daf-4edc-9180-11c0943f15ea/volumes" Apr 16 18:55:42.673381 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:42.673281 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-684947fd6b-tfgf9_d351123e-13f9-4b65-b989-8d01ad589016/metrics-server/0.log" Apr 16 18:55:42.743940 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:42.743909 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4mpdb_f4753fcd-d099-4568-8ab9-ba10a06ebc56/node-exporter/0.log" Apr 16 18:55:42.763560 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:42.763532 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4mpdb_f4753fcd-d099-4568-8ab9-ba10a06ebc56/kube-rbac-proxy/0.log" Apr 16 18:55:42.785142 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:42.785120 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4mpdb_f4753fcd-d099-4568-8ab9-ba10a06ebc56/init-textfile/0.log" Apr 16 18:55:43.068002 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:43.067970 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da/prometheus/0.log" Apr 16 18:55:43.085171 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:43.085143 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da/config-reloader/0.log" Apr 16 18:55:43.109186 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:43.109156 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da/thanos-sidecar/0.log" Apr 16 18:55:43.133664 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:43.133640 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da/kube-rbac-proxy-web/0.log" Apr 16 18:55:43.154261 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:43.154238 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da/kube-rbac-proxy/0.log" Apr 16 18:55:43.175982 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:43.175959 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da/kube-rbac-proxy-thanos/0.log" Apr 16 18:55:43.198776 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:43.198760 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_dbdfbd1a-5caf-430c-9dc2-4e8bc6d974da/init-config-reloader/0.log" Apr 16 18:55:43.299650 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:43.299621 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d655577fb-fhkvt_68ba8e3f-0e96-499d-bbf5-823e6b695520/telemeter-client/0.log" Apr 16 18:55:43.321896 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:43.321831 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d655577fb-fhkvt_68ba8e3f-0e96-499d-bbf5-823e6b695520/reload/0.log" Apr 16 18:55:43.342728 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:43.342704 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d655577fb-fhkvt_68ba8e3f-0e96-499d-bbf5-823e6b695520/kube-rbac-proxy/0.log" Apr 16 18:55:46.099416 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.099383 2578 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt"] Apr 16 18:55:46.099793 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.099671 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e850988e-8daf-4edc-9180-11c0943f15ea" containerName="gather" Apr 16 18:55:46.099793 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.099683 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e850988e-8daf-4edc-9180-11c0943f15ea" containerName="gather" Apr 16 18:55:46.099793 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.099706 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e850988e-8daf-4edc-9180-11c0943f15ea" containerName="copy" Apr 16 18:55:46.099793 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.099711 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e850988e-8daf-4edc-9180-11c0943f15ea" containerName="copy" Apr 16 18:55:46.099793 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.099750 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e850988e-8daf-4edc-9180-11c0943f15ea" containerName="copy" Apr 16 18:55:46.099793 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.099759 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e850988e-8daf-4edc-9180-11c0943f15ea" containerName="gather" Apr 16 18:55:46.102495 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.102474 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt" Apr 16 18:55:46.105411 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.105392 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vt288\"/\"default-dockercfg-nbxpr\"" Apr 16 18:55:46.106572 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.106553 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vt288\"/\"openshift-service-ca.crt\"" Apr 16 18:55:46.106674 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.106592 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vt288\"/\"kube-root-ca.crt\"" Apr 16 18:55:46.111707 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.111683 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt"] Apr 16 18:55:46.276905 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.276861 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d517f24e-3f02-4cf3-bd19-b048bbb4cc43-sys\") pod \"perf-node-gather-daemonset-z9tkt\" (UID: \"d517f24e-3f02-4cf3-bd19-b048bbb4cc43\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt" Apr 16 18:55:46.276905 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.276906 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d517f24e-3f02-4cf3-bd19-b048bbb4cc43-proc\") pod \"perf-node-gather-daemonset-z9tkt\" (UID: \"d517f24e-3f02-4cf3-bd19-b048bbb4cc43\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt" Apr 16 18:55:46.277131 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.276988 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"podres\" (UniqueName: \"kubernetes.io/host-path/d517f24e-3f02-4cf3-bd19-b048bbb4cc43-podres\") pod \"perf-node-gather-daemonset-z9tkt\" (UID: \"d517f24e-3f02-4cf3-bd19-b048bbb4cc43\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt"
Apr 16 18:55:46.277131 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.277027 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d517f24e-3f02-4cf3-bd19-b048bbb4cc43-lib-modules\") pod \"perf-node-gather-daemonset-z9tkt\" (UID: \"d517f24e-3f02-4cf3-bd19-b048bbb4cc43\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt"
Apr 16 18:55:46.277131 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.277085 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwtm5\" (UniqueName: \"kubernetes.io/projected/d517f24e-3f02-4cf3-bd19-b048bbb4cc43-kube-api-access-wwtm5\") pod \"perf-node-gather-daemonset-z9tkt\" (UID: \"d517f24e-3f02-4cf3-bd19-b048bbb4cc43\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt"
Apr 16 18:55:46.377750 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.377667 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d517f24e-3f02-4cf3-bd19-b048bbb4cc43-podres\") pod \"perf-node-gather-daemonset-z9tkt\" (UID: \"d517f24e-3f02-4cf3-bd19-b048bbb4cc43\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt"
Apr 16 18:55:46.377750 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.377709 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d517f24e-3f02-4cf3-bd19-b048bbb4cc43-lib-modules\") pod \"perf-node-gather-daemonset-z9tkt\" (UID: \"d517f24e-3f02-4cf3-bd19-b048bbb4cc43\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt"
Apr 16 18:55:46.377750 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.377749 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwtm5\" (UniqueName: \"kubernetes.io/projected/d517f24e-3f02-4cf3-bd19-b048bbb4cc43-kube-api-access-wwtm5\") pod \"perf-node-gather-daemonset-z9tkt\" (UID: \"d517f24e-3f02-4cf3-bd19-b048bbb4cc43\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt"
Apr 16 18:55:46.377967 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.377779 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d517f24e-3f02-4cf3-bd19-b048bbb4cc43-sys\") pod \"perf-node-gather-daemonset-z9tkt\" (UID: \"d517f24e-3f02-4cf3-bd19-b048bbb4cc43\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt"
Apr 16 18:55:46.377967 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.377825 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d517f24e-3f02-4cf3-bd19-b048bbb4cc43-sys\") pod \"perf-node-gather-daemonset-z9tkt\" (UID: \"d517f24e-3f02-4cf3-bd19-b048bbb4cc43\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt"
Apr 16 18:55:46.377967 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.377851 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d517f24e-3f02-4cf3-bd19-b048bbb4cc43-proc\") pod \"perf-node-gather-daemonset-z9tkt\" (UID: \"d517f24e-3f02-4cf3-bd19-b048bbb4cc43\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt"
Apr 16 18:55:46.377967 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.377857 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d517f24e-3f02-4cf3-bd19-b048bbb4cc43-lib-modules\") pod \"perf-node-gather-daemonset-z9tkt\" (UID: \"d517f24e-3f02-4cf3-bd19-b048bbb4cc43\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt"
Apr 16 18:55:46.377967 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.377858 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d517f24e-3f02-4cf3-bd19-b048bbb4cc43-podres\") pod \"perf-node-gather-daemonset-z9tkt\" (UID: \"d517f24e-3f02-4cf3-bd19-b048bbb4cc43\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt"
Apr 16 18:55:46.377967 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.377932 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d517f24e-3f02-4cf3-bd19-b048bbb4cc43-proc\") pod \"perf-node-gather-daemonset-z9tkt\" (UID: \"d517f24e-3f02-4cf3-bd19-b048bbb4cc43\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt"
Apr 16 18:55:46.386203 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.386173 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwtm5\" (UniqueName: \"kubernetes.io/projected/d517f24e-3f02-4cf3-bd19-b048bbb4cc43-kube-api-access-wwtm5\") pod \"perf-node-gather-daemonset-z9tkt\" (UID: \"d517f24e-3f02-4cf3-bd19-b048bbb4cc43\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt"
Apr 16 18:55:46.412127 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.412105 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt"
Apr 16 18:55:46.530776 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.530742 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt"]
Apr 16 18:55:46.533734 ip-10-0-134-55 kubenswrapper[2578]: W0416 18:55:46.533705 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd517f24e_3f02_4cf3_bd19_b048bbb4cc43.slice/crio-90b6a7e3a5908a68d32941b4172cc4a601f8283025b04ec261c78e8d0d7953ec WatchSource:0}: Error finding container 90b6a7e3a5908a68d32941b4172cc4a601f8283025b04ec261c78e8d0d7953ec: Status 404 returned error can't find the container with id 90b6a7e3a5908a68d32941b4172cc4a601f8283025b04ec261c78e8d0d7953ec
Apr 16 18:55:46.779615 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.779561 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt" event={"ID":"d517f24e-3f02-4cf3-bd19-b048bbb4cc43","Type":"ContainerStarted","Data":"8ddc133bbde505fc7013156da84e9a05591a26864e64e9510e05f80bfa68b9a4"}
Apr 16 18:55:46.779615 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.779615 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt" event={"ID":"d517f24e-3f02-4cf3-bd19-b048bbb4cc43","Type":"ContainerStarted","Data":"90b6a7e3a5908a68d32941b4172cc4a601f8283025b04ec261c78e8d0d7953ec"}
Apr 16 18:55:46.779935 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.779681 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt"
Apr 16 18:55:46.799039 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.798995 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt" podStartSLOduration=0.798979633 podStartE2EDuration="798.979633ms" podCreationTimestamp="2026-04-16 18:55:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:55:46.7980808 +0000 UTC m=+3217.101814210" watchObservedRunningTime="2026-04-16 18:55:46.798979633 +0000 UTC m=+3217.102713027"
Apr 16 18:55:46.818694 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.818670 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-l5w8p_bf5f2e9a-c558-4494-9601-e1ad83f7056e/dns/0.log"
Apr 16 18:55:46.838745 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.838713 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-l5w8p_bf5f2e9a-c558-4494-9601-e1ad83f7056e/kube-rbac-proxy/0.log"
Apr 16 18:55:46.881779 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:46.881735 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lgc4z_179b0e84-c4b0-49f4-af8c-88127e5c999a/dns-node-resolver/0.log"
Apr 16 18:55:47.451562 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:47.451537 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-85mvt_e2c3dbfa-ec70-406e-af1c-6ac3c1aba031/node-ca/0.log"
Apr 16 18:55:48.613152 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:48.613122 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-kf467_b360b9d8-466f-4200-b65e-ee3078c85b2f/serve-healthcheck-canary/0.log"
Apr 16 18:55:49.026937 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:49.026914 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l8df6_d1a46e78-ff9f-4969-b4f5-a5a6f37dc199/kube-rbac-proxy/0.log"
Apr 16 18:55:49.046623 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:49.046581 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l8df6_d1a46e78-ff9f-4969-b4f5-a5a6f37dc199/exporter/0.log"
Apr 16 18:55:49.070591 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:49.070567 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l8df6_d1a46e78-ff9f-4969-b4f5-a5a6f37dc199/extractor/0.log"
Apr 16 18:55:51.512043 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:51.512018 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-84rgk_bab1736b-0759-4f5a-8406-d90803ac2552/s3-init/0.log"
Apr 16 18:55:52.791467 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:52.791434 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-vt288/perf-node-gather-daemonset-z9tkt"
Apr 16 18:55:57.569115 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:57.569085 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dgmt6_2d9e5152-75a1-43b4-9c2b-f42acb2075df/kube-multus-additional-cni-plugins/0.log"
Apr 16 18:55:57.589509 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:57.589489 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dgmt6_2d9e5152-75a1-43b4-9c2b-f42acb2075df/egress-router-binary-copy/0.log"
Apr 16 18:55:57.609381 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:57.609358 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dgmt6_2d9e5152-75a1-43b4-9c2b-f42acb2075df/cni-plugins/0.log"
Apr 16 18:55:57.630069 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:57.630045 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dgmt6_2d9e5152-75a1-43b4-9c2b-f42acb2075df/bond-cni-plugin/0.log"
Apr 16 18:55:57.653714 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:57.653688 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dgmt6_2d9e5152-75a1-43b4-9c2b-f42acb2075df/routeoverride-cni/0.log"
Apr 16 18:55:57.673935 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:57.673918 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dgmt6_2d9e5152-75a1-43b4-9c2b-f42acb2075df/whereabouts-cni-bincopy/0.log"
Apr 16 18:55:57.694893 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:57.694871 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dgmt6_2d9e5152-75a1-43b4-9c2b-f42acb2075df/whereabouts-cni/0.log"
Apr 16 18:55:57.787041 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:57.787009 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nc622_526fd613-ae22-4771-a739-ad3226226ba3/kube-multus/0.log"
Apr 16 18:55:57.852864 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:57.852797 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vnntp_e1d07798-4149-483d-a793-b43a2b14fdbf/network-metrics-daemon/0.log"
Apr 16 18:55:57.872014 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:57.871991 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vnntp_e1d07798-4149-483d-a793-b43a2b14fdbf/kube-rbac-proxy/0.log"
Apr 16 18:55:59.005321 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:59.005294 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r77l7_c11179d3-6347-4962-9a7f-b134abd01cf4/ovn-controller/0.log"
Apr 16 18:55:59.045148 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:59.045119 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r77l7_c11179d3-6347-4962-9a7f-b134abd01cf4/ovn-acl-logging/0.log"
Apr 16 18:55:59.067733 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:59.067712 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r77l7_c11179d3-6347-4962-9a7f-b134abd01cf4/kube-rbac-proxy-node/0.log"
Apr 16 18:55:59.094165 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:59.094138 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r77l7_c11179d3-6347-4962-9a7f-b134abd01cf4/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 18:55:59.115806 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:59.115788 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r77l7_c11179d3-6347-4962-9a7f-b134abd01cf4/northd/0.log"
Apr 16 18:55:59.143196 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:59.143176 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r77l7_c11179d3-6347-4962-9a7f-b134abd01cf4/nbdb/0.log"
Apr 16 18:55:59.174041 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:59.174023 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r77l7_c11179d3-6347-4962-9a7f-b134abd01cf4/sbdb/0.log"
Apr 16 18:55:59.273748 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:55:59.273676 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r77l7_c11179d3-6347-4962-9a7f-b134abd01cf4/ovnkube-controller/0.log"
Apr 16 18:56:00.606333 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:56:00.606303 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-r8w9f_378c671e-1d04-4256-a001-a7ce2a0d3b86/network-check-target-container/0.log"
Apr 16 18:56:01.515989 ip-10-0-134-55 kubenswrapper[2578]: I0416 18:56:01.515960 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-s6vmk_7430323d-b0e8-4d21-9d45-f29e8e9f976f/iptables-alerter/0.log"